Sep 06, 2025 12:40:58 AM org.apache.karaf.main.Main launch
INFO: Installing and starting initial bundles
Sep 06, 2025 12:40:58 AM org.apache.karaf.main.Main launch
INFO: All initial bundles installed and set to start
Sep 06, 2025 12:40:58 AM org.apache.karaf.main.lock.SimpleFileLock lock
INFO: Trying to lock /tmp/karaf-0.23.0/lock
Sep 06, 2025 12:40:58 AM org.apache.karaf.main.lock.SimpleFileLock lock
INFO: Lock acquired
Sep 06, 2025 12:40:58 AM org.apache.karaf.main.Main$KarafLockCallback lockAcquired
INFO: Lock acquired. Setting startlevel to 100
2025-09-06T00:40:59,295 | INFO | CM Configuration Updater (Update: pid=org.ops4j.pax.logging) | EventAdminConfigurationNotifier | 4 - org.ops4j.pax.logging.pax-logging-log4j2 - 2.2.8 | Logging configuration changed. (Event Admin service unavailable - no notification sent).
2025-09-06T00:41:00,451 | INFO | activator-1-thread-2 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Adding features: odl-openflowplugin-flow-services-rest/[0.20.0,0.20.0],odl-openflowplugin-app-bulk-o-matic/[0.20.0,0.20.0],8dd284d9-52d1-4df7-a3c3-8487be758ad1/[0,0.0.0],odl-infrautils-ready/[7.1.4,7.1.4],odl-jolokia/[11.0.0,11.0.0]
2025-09-06T00:41:00,640 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Changes to perform:
2025-09-06T00:41:00,640 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Region: root
2025-09-06T00:41:00,640 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Bundles to install:
2025-09-06T00:41:00,641 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:jakarta.el/jakarta.el-api/3.0.3
2025-09-06T00:41:00,641 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:javax.enterprise/cdi-api/2.0.SP1
2025-09-06T00:41:00,641 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:javax.interceptor/javax.interceptor-api/1.2.2
2025-09-06T00:41:00,641 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:javax.transaction/javax.transaction-api/1.2
2025-09-06T00:41:00,641 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.jasypt/1.9.3_1
2025-09-06T00:41:00,641 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.javax-inject/1_3
2025-09-06T00:41:00,641 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.jdbc/pax-jdbc/1.5.7
2025-09-06T00:41:00,641 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.jdbc/pax-jdbc-config/1.5.7
2025-09-06T00:41:00,641 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.jdbc/pax-jdbc-pool-common/1.5.7
2025-09-06T00:41:00,642 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.url/pax-url-wrap/2.6.16/jar/uber
2025-09-06T00:41:00,642 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.osgi/org.osgi.service.jdbc/1.1.0
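For context, the resolution recorded above is driven by Karaf's FeaturesService: each requested feature (for example odl-jolokia/[11.0.0,11.0.0]) is resolved into the list of mvn: bundle URLs shown, and those bundles are then installed and started through the plain OSGi framework API. The sketch below only illustrates those two entry points from a hypothetical bundle of your own running inside this container; the package and class names are made up for the example, while FeaturesService#installFeature, BundleContext#installBundle and Bundle#start are the standard Karaf/OSGi calls.

```java
// Illustrative only, not part of this log. Package/class names are hypothetical;
// FeaturesService, BundleContext and Bundle are the standard Karaf/OSGi APIs.
package example;

import org.apache.karaf.features.FeaturesService;
import org.osgi.framework.Bundle;
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceReference;

public class FeatureInstallSketch implements BundleActivator {

    @Override
    public void start(BundleContext context) throws Exception {
        // An "Adding features:" line corresponds to a request against the FeaturesService,
        // which is published as an OSGi service by org.apache.karaf.features.core.
        ServiceReference<FeaturesService> ref = context.getServiceReference(FeaturesService.class);
        if (ref != null) {
            FeaturesService features = context.getService(ref);
            // The service then computes the "Changes to perform:" section
            // (bundles to install/uninstall per region) and carries it out.
            features.installFeature("odl-jolokia");
        }

        // Each "Installing bundles:" entry is a plain OSGi bundle install from a mvn: URL
        // (resolved inside Karaf by the pax-url handlers), followed by the
        // "Starting bundles:" phase, i.e. Bundle#start.
        Bundle bundle = context.installBundle("mvn:jakarta.el/jakarta.el-api/3.0.3");
        bundle.start();
    }

    @Override
    public void stop(BundleContext context) {
        // Nothing to clean up in this sketch.
    }
}
```

Deployed as a regular bundle, a request like this would be expected to produce "Changes to perform" / "Installing bundles" / "Starting bundles" entries similar to the ones that follow in this log.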
2025-09-06T00:41:00,643 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Installing bundles:
2025-09-06T00:41:00,644 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:jakarta.el/jakarta.el-api/3.0.3
2025-09-06T00:41:00,646 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:javax.enterprise/cdi-api/2.0.SP1
2025-09-06T00:41:00,648 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:javax.interceptor/javax.interceptor-api/1.2.2
2025-09-06T00:41:00,649 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:javax.transaction/javax.transaction-api/1.2
2025-09-06T00:41:00,650 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.jasypt/1.9.3_1
2025-09-06T00:41:00,651 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.javax-inject/1_3
2025-09-06T00:41:00,652 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.jdbc/pax-jdbc/1.5.7
2025-09-06T00:41:00,653 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.jdbc/pax-jdbc-config/1.5.7
2025-09-06T00:41:00,654 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.jdbc/pax-jdbc-pool-common/1.5.7
2025-09-06T00:41:00,655 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.url/pax-url-wrap/2.6.16/jar/uber
2025-09-06T00:41:00,658 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.osgi/org.osgi.service.jdbc/1.1.0
2025-09-06T00:41:00,685 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Starting bundles:
2025-09-06T00:41:00,686 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.ops4j.pax.url.wrap/2.6.16
2025-09-06T00:41:00,689 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.osgi.service.jdbc/1.1.0.202212101352
2025-09-06T00:41:00,691 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.servicemix.bundles.jasypt/1.9.3.1
2025-09-06T00:41:00,692 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.ops4j.pax.jdbc/1.5.7
2025-09-06T00:41:00,696 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | javax.el-api/3.0.3
2025-09-06T00:41:00,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.servicemix.bundles.javax-inject/1.0.0.3
2025-09-06T00:41:00,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | javax.interceptor-api/1.2.2
2025-09-06T00:41:00,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | javax.enterprise.cdi-api/2.0.0.SP1
2025-09-06T00:41:00,698 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | javax.transaction-api/1.2.0
2025-09-06T00:41:00,699 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.ops4j.pax.jdbc.pool.common/1.5.7
2025-09-06T00:41:00,700 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.ops4j.pax.jdbc.config/1.5.7
2025-09-06T00:41:00,711 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Done.
2025-09-06T00:41:02,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Changes to perform:
2025-09-06T00:41:02,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Region: root
2025-09-06T00:41:02,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Bundles to uninstall:
2025-09-06T00:41:02,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.servicemix.bundles.javax-inject/1.0.0.3
2025-09-06T00:41:02,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Bundles to install:
2025-09-06T00:41:02,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.checkerframework/checker-qual/3.49.3
2025-09-06T00:41:02,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:com.google.code.gson/gson/2.13.1
2025-09-06T00:41:02,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:com.google.guava/guava/33.4.8-jre
2025-09-06T00:41:02,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:com.google.guava/failureaccess/1.0.3
2025-09-06T00:41:02,706 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:com.googlecode.json-simple/json-simple/1.1.1
2025-09-06T00:41:02,706 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:com.h2database/h2/2.3.232
2025-09-06T00:41:02,706 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:com.rabbitmq/amqp-client/5.25.0
2025-09-06T00:41:02,706 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:com.typesafe/config/1.4.3
2025-09-06T00:41:02,706 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:com.typesafe/ssl-config-core_2.13/0.6.1
2025-09-06T00:41:02,706 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.aeron/aeron-client/1.38.1
2025-09-06T00:41:02,706 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.aeron/aeron-driver/1.38.1
2025-09-06T00:41:02,706 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.dropwizard.metrics/metrics-core/4.2.32
2025-09-06T00:41:02,706 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.dropwizard.metrics/metrics-graphite/4.2.32
2025-09-06T00:41:02,706 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.dropwizard.metrics/metrics-healthchecks/4.2.32
2025-09-06T00:41:02,706 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.dropwizard.metrics/metrics-jmx/4.2.32
2025-09-06T00:41:02,706 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.dropwizard.metrics/metrics-jvm/4.2.32
2025-09-06T00:41:02,706 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.netty/netty-buffer/4.2.2.Final
2025-09-06T00:41:02,707 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.netty/netty-codec-base/4.2.2.Final
2025-09-06T00:41:02,707 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.netty/netty-codec-compression/4.2.2.Final
2025-09-06T00:41:02,707 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.netty/netty-codec-http/4.2.2.Final
2025-09-06T00:41:02,707 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.netty/netty-codec-http2/4.2.2.Final
2025-09-06T00:41:02,707 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.netty/netty-common/4.2.2.Final
2025-09-06T00:41:02,707 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.netty/netty-handler/4.2.2.Final
2025-09-06T00:41:02,707 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.netty/netty-resolver/4.2.2.Final
2025-09-06T00:41:02,707 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.netty/netty-transport/4.2.2.Final
2025-09-06T00:41:02,707 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.netty/netty-transport-classes-epoll/4.2.2.Final
2025-09-06T00:41:02,707 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.netty/netty-transport-native-epoll/4.2.2.Final/jar/linux-x86_64
2025-09-06T00:41:02,707 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.netty/netty-transport-native-unix-common/4.2.2.Final
2025-09-06T00:41:02,707 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:jakarta.activation/jakarta.activation-api/1.2.2
2025-09-06T00:41:02,707 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:jakarta.annotation/jakarta.annotation-api/1.3.5
2025-09-06T00:41:02,708 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:jakarta.servlet/jakarta.servlet-api/4.0.4
2025-09-06T00:41:02,708 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:jakarta.validation/jakarta.validation-api/2.0.2
2025-09-06T00:41:02,708 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:jakarta.ws.rs/jakarta.ws.rs-api/2.1.6
2025-09-06T00:41:02,708 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.javassist/javassist/3.30.2-GA
2025-09-06T00:41:02,708 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:javax.servlet/javax.servlet-api/3.1.0
2025-09-06T00:41:02,708 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:jakarta.websocket/jakarta.websocket-api/1.1.2
2025-09-06T00:41:02,708 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.odlparent/karaf.branding/14.1.0
2025-09-06T00:41:02,708 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.lz4/lz4-java/1.8.0
2025-09-06T00:41:02,708 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:net.bytebuddy/byte-buddy/1.17.5
2025-09-06T00:41:02,708 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.agrona/agrona/1.15.2
2025-09-06T00:41:02,708 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.antlr/antlr4-runtime/4.13.2
2025-09-06T00:41:02,708 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.aries.blueprint/org.apache.aries.blueprint.api/1.0.1
2025-09-06T00:41:02,708 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.aries.blueprint/org.apache.aries.blueprint.cm/1.3.2
2025-09-06T00:41:02,709 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.aries.blueprint/org.apache.aries.blueprint.core/1.10.3
2025-09-06T00:41:02,709 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.api/1.1.5
2025-09-06T00:41:02,709 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.blueprint.api/1.2.0
2025-09-06T00:41:02,709 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.blueprint.core/1.2.0
2025-09-06T00:41:02,709 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.core/1.1.8
2025-09-06T00:41:02,709 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.whiteboard/1.2.0
2025-09-06T00:41:02,709 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.aries.proxy/org.apache.aries.proxy/1.1.14
2025-09-06T00:41:02,709 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.aries.quiesce/org.apache.aries.quiesce.api/1.0.0
2025-09-06T00:41:02,709 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.aries/org.apache.aries.util/1.1.3
2025-09-06T00:41:02,709 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:commons-collections/commons-collections/3.2.2
2025-09-06T00:41:02,709 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:commons-beanutils/commons-beanutils/1.11.0
2025-09-06T00:41:02,709 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:commons-codec/commons-codec/1.15
2025-09-06T00:41:02,709 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.commons/commons-lang3/3.17.0
2025-09-06T00:41:02,710 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.commons/commons-text/1.13.0
2025-09-06T00:41:02,710 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.felix/org.apache.felix.scr/2.2.6
2025-09-06T00:41:02,710 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.geronimo.specs/geronimo-atinject_1.0_spec/1.2
2025-09-06T00:41:02,710 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.bundle/org.apache.karaf.bundle.blueprintstate/4.4.7
2025-09-06T00:41:02,710 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.bundle/org.apache.karaf.bundle.core/4.4.7
2025-09-06T00:41:02,710 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.config/org.apache.karaf.config.command/4.4.7
2025-09-06T00:41:02,710 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.blueprint/4.4.7
2025-09-06T00:41:02,710 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.features/4.4.7
2025-09-06T00:41:02,710 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.kar/4.4.7
2025-09-06T00:41:02,710 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.wrap/4.4.7
2025-09-06T00:41:02,710 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.diagnostic/org.apache.karaf.diagnostic.boot/4.4.7
2025-09-06T00:41:02,710 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.diagnostic/org.apache.karaf.diagnostic.core/4.4.7
2025-09-06T00:41:02,710 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.features/org.apache.karaf.features.command/4.4.7
2025-09-06T00:41:02,710 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.http/org.apache.karaf.http.core/4.4.7
2025-09-06T00:41:02,710 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.instance/org.apache.karaf.instance.core/4.4.7
2025-09-06T00:41:02,711 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.jaas/org.apache.karaf.jaas.command/4.4.7
2025-09-06T00:41:02,711 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.jaas/org.apache.karaf.jaas.config/4.4.7
2025-09-06T00:41:02,711 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.jaas/org.apache.karaf.jaas.modules/4.4.7
2025-09-06T00:41:02,711 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.jdbc/org.apache.karaf.jdbc.core/4.4.7
2025-09-06T00:41:02,711 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.kar/org.apache.karaf.kar.core/4.4.7
2025-09-06T00:41:02,711 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.log/org.apache.karaf.log.core/4.4.7
2025-09-06T00:41:02,711 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.management/org.apache.karaf.management.server/4.4.7
2025-09-06T00:41:02,711 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.package/org.apache.karaf.package.core/4.4.7
2025-09-06T00:41:02,711 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.scr/org.apache.karaf.scr.management/4.4.7
2025-09-06T00:41:02,711 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.scr/org.apache.karaf.scr.state/4.4.7
2025-09-06T00:41:02,711 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.service/org.apache.karaf.service.core/4.4.7
2025-09-06T00:41:02,711 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.commands/4.4.7
2025-09-06T00:41:02,712 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.console/4.4.7
2025-09-06T00:41:02,712 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.core/4.4.7
2025-09-06T00:41:02,712 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.ssh/4.4.7
2025-09-06T00:41:02,712 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.table/4.4.7
2025-09-06T00:41:02,712 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.system/org.apache.karaf.system.core/4.4.7
2025-09-06T00:41:02,712 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.web/org.apache.karaf.web.core/4.4.7
2025-09-06T00:41:02,712 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.sshd/sshd-osgi/2.14.0
2025-09-06T00:41:02,712 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.sshd/sshd-scp/2.14.0
2025-09-06T00:41:02,712 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.sshd/sshd-sftp/2.14.0
2025-09-06T00:41:02,712 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jdt/ecj/3.26.0
2025-09-06T00:41:02,712 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-client/9.4.57.v20241219
2025-09-06T00:41:02,712 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-continuation/9.4.57.v20241219
2025-09-06T00:41:02,712 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-http/9.4.57.v20241219
2025-09-06T00:41:02,712 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-io/9.4.57.v20241219
2025-09-06T00:41:02,712 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-jaas/9.4.57.v20241219
2025-09-06T00:41:02,712 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-jmx/9.4.57.v20241219
2025-09-06T00:41:02,713 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-security/9.4.57.v20241219
2025-09-06T00:41:02,713 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-server/9.4.57.v20241219
2025-09-06T00:41:02,713 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-servlet/9.4.57.v20241219
2025-09-06T00:41:02,713 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-servlets/9.4.57.v20241219
2025-09-06T00:41:02,713 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-util/9.4.57.v20241219
2025-09-06T00:41:02,713 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-util-ajax/9.4.57.v20241219
2025-09-06T00:41:02,713 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-xml/9.4.57.v20241219
2025-09-06T00:41:02,713 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.glassfish.hk2/hk2-api/2.6.1
2025-09-06T00:41:02,713 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.glassfish.hk2.external/aopalliance-repackaged/2.6.1
2025-09-06T00:41:02,713 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.glassfish.hk2/hk2-locator/2.6.1
2025-09-06T00:41:02,713 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.glassfish.hk2/osgi-resource-locator/1.0.3
2025-09-06T00:41:02,713 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.glassfish.hk2/hk2-utils/2.6.1
2025-09-06T00:41:02,713 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.glassfish.jersey.containers/jersey-container-servlet/2.47
2025-09-06T00:41:02,713 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.glassfish.jersey.containers/jersey-container-servlet-core/2.47
2025-09-06T00:41:02,713 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.glassfish.jersey.core/jersey-client/2.47
2025-09-06T00:41:02,713 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.glassfish.jersey.core/jersey-common/2.47
2025-09-06T00:41:02,714 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.glassfish.jersey.core/jersey-server/2.47
2025-09-06T00:41:02,714 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.glassfish.jersey.inject/jersey-hk2/2.47
2025-09-06T00:41:02,714 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.glassfish.jersey.media/jersey-media-sse/2.47
2025-09-06T00:41:02,714 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.jline/jline/3.21.0
2025-09-06T00:41:02,714 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.jolokia/jolokia-osgi/1.7.2
2025-09-06T00:41:02,714 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.jspecify/jspecify/1.0.0
2025-09-06T00:41:02,714 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ow2.asm/asm/9.7.1
2025-09-06T00:41:02,714 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ow2.asm/asm-commons/9.7.1
2025-09-06T00:41:02,714 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ow2.asm/asm-tree/9.7.1
2025-09-06T00:41:02,714 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ow2.asm/asm-analysis/9.7.1
2025-09-06T00:41:02,714 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ow2.asm/asm-util/9.7.1
2025-09-06T00:41:02,714 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/aaa-authn-api/0.21.0
2025-09-06T00:41:02,714 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/aaa-cert/0.21.0
2025-09-06T00:41:02,714 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/aaa-encrypt-service/0.21.0
2025-09-06T00:41:02,714 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/aaa-encrypt-service-impl/0.21.0
2025-09-06T00:41:02,714 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/aaa-filterchain/0.21.0
2025-09-06T00:41:02,715 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/aaa-idm-store-h2/0.21.0
2025-09-06T00:41:02,715 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/aaa-jetty-auth-log-filter/0.21.0
2025-09-06T00:41:02,715 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/aaa-password-service-api/0.21.0
2025-09-06T00:41:02,715 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/aaa-password-service-impl/0.21.0
2025-09-06T00:41:02,715 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/repackaged-shiro/0.21.0
2025-09-06T00:41:02,715 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/aaa-shiro/0.21.0
2025-09-06T00:41:02,715 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/aaa-shiro-api/0.21.0
2025-09-06T00:41:02,715 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/aaa-tokenauthrealm/0.21.0
2025-09-06T00:41:02,715 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa.web/web-api/0.21.0
2025-09-06T00:41:02,715 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa.web/web-osgi-impl/0.21.0
2025-09-06T00:41:02,715 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa.web/servlet-api/0.21.0
2025-09-06T00:41:02,715 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa.web/servlet-jersey2/0.21.0
2025-09-06T00:41:02,715 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/atomix-storage/11.0.0
2025-09-06T00:41:02,715 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/blueprint/11.0.0
2025-09-06T00:41:02,715 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/cds-access-api/11.0.0
2025-09-06T00:41:02,715 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/cds-access-client/11.0.0
2025-09-06T00:41:02,715 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/cds-dom-api/11.0.0
2025-09-06T00:41:02,716 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/cds-mgmt-api/11.0.0
2025-09-06T00:41:02,716 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/eos-dom-akka/11.0.0
2025-09-06T00:41:02,716 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/raft-api/11.0.0
2025-09-06T00:41:02,716 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/raft-journal/11.0.0
2025-09-06T00:41:02,716 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/raft-spi/11.0.0
2025-09-06T00:41:02,716 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/repackaged-pekko/11.0.0
2025-09-06T00:41:02,716 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/sal-akka-raft/11.0.0
2025-09-06T00:41:02,716 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/sal-akka-segmented-journal/11.0.0
2025-09-06T00:41:02,716 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/sal-cluster-admin-api/11.0.0
2025-09-06T00:41:02,716 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/sal-cluster-admin-impl/11.0.0
2025-09-06T00:41:02,716 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/sal-clustering-commons/11.0.0
2025-09-06T00:41:02,716 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/sal-common-util/11.0.0
2025-09-06T00:41:02,716 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/sal-distributed-datastore/11.0.0
2025-09-06T00:41:02,716 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/sal-remoterpc-connector/11.0.0
2025-09-06T00:41:02,716 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.infrautils/diagstatus-api/7.1.4
2025-09-06T00:41:02,716 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.infrautils/diagstatus-impl/7.1.4
2025-09-06T00:41:02,716 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.infrautils/diagstatus-shell/7.1.4
2025-09-06T00:41:02,716 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.infrautils/ready-api/7.1.4
2025-09-06T00:41:02,717 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.infrautils/ready-impl/7.1.4
2025-09-06T00:41:02,717 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.infrautils/infrautils-util/7.1.4
2025-09-06T00:41:02,717 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-binding-dom-adapter/14.0.13
2025-09-06T00:41:02,717 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-binding-util/14.0.13
2025-09-06T00:41:02,717 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-crypt-hash/14.0.13
2025-09-06T00:41:02,717 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-ssh-encryption-algs/14.0.13
2025-09-06T00:41:02,717 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-ssh-key-exchange-algs/14.0.13
2025-09-06T00:41:02,717 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-ssh-mac-algs/14.0.13
2025-09-06T00:41:02,717 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-ssh-public-key-algs/14.0.13
2025-09-06T00:41:02,717 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-tls-cipher-suite-algs/14.0.13
2025-09-06T00:41:02,717 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6241/14.0.13
2025-09-06T00:41:02,717 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6243/14.0.13
2025-09-06T00:41:02,717 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6470/14.0.13
2025-09-06T00:41:02,717 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6991-ietf-inet-types/14.0.13
2025-09-06T00:41:02,717 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6991-ietf-yang-types/14.0.13
2025-09-06T00:41:02,717 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc7407-ietf-x509-cert-to-name/14.0.13
2025-09-06T00:41:02,717 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc7952/14.0.13
2025-09-06T00:41:02,717 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8040-ietf-restconf/14.0.13
2025-09-06T00:41:02,718 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8040-ietf-restconf-monitoring/14.0.13
2025-09-06T00:41:02,718 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8072/14.0.13
2025-09-06T00:41:02,718 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8341/14.0.13
2025-09-06T00:41:02,718 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8342-ietf-datastores/14.0.13
2025-09-06T00:41:02,718 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8342-ietf-origin/14.0.13
2025-09-06T00:41:02,718 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8343/14.0.13
2025-09-06T00:41:02,718 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8344/14.0.13
2025-09-06T00:41:02,718 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8525/14.0.13
2025-09-06T00:41:02,718 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8526/14.0.13
2025-09-06T00:41:02,718 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8528/14.0.13
2025-09-06T00:41:02,718 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8529/14.0.13
2025-09-06T00:41:02,718 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8639/14.0.13
2025-09-06T00:41:02,718 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8650/14.0.13
2025-09-06T00:41:02,718 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9640/14.0.13
2025-09-06T00:41:02,718 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9641/14.0.13
2025-09-06T00:41:02,718 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9642/14.0.13
2025-09-06T00:41:02,718 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9643-ietf-tcp-client/14.0.13
2025-09-06T00:41:02,718 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9643-ietf-tcp-common/14.0.13
2025-09-06T00:41:02,719 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9643-ietf-tcp-server/14.0.13
2025-09-06T00:41:02,719 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9644-ietf-ssh-client/14.0.13
2025-09-06T00:41:02,719 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9644-ietf-ssh-common/14.0.13
2025-09-06T00:41:02,719 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9644-ietf-ssh-server/14.0.13
2025-09-06T00:41:02,719 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9645-ietf-tls-client/14.0.13
2025-09-06T00:41:02,719 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9645-ietf-tls-common/14.0.13
2025-09-06T00:41:02,719 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9645-ietf-tls-server/14.0.13
2025-09-06T00:41:02,719 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-eos-binding-adapter/14.0.13
2025-09-06T00:41:02,719 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-binding-api/14.0.13
2025-09-06T00:41:02,719 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-binding-spi/14.0.13
2025-09-06T00:41:02,719 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-common-api/14.0.13
2025-09-06T00:41:02,719 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-dom-api/14.0.13
2025-09-06T00:41:02,719 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-dom-broker/14.0.13
2025-09-06T00:41:02,719 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-dom-schema-osgi/14.0.13
2025-09-06T00:41:02,720 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-dom-spi/14.0.13
2025-09-06T00:41:02,720 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-eos-binding-api/14.0.13
2025-09-06T00:41:02,720 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-eos-common-api/14.0.13
2025-09-06T00:41:02,720 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-eos-dom-api/14.0.13
2025-09-06T00:41:02,720 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-singleton-api/14.0.13
2025-09-06T00:41:02,720 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-singleton-impl/14.0.13
2025-09-06T00:41:02,720 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.model/general-entity/14.0.13
2025-09-06T00:41:02,720 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.model/ietf-topology/2013.10.21.26.13
2025-09-06T00:41:02,720 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.model/ietf-type-util/14.0.13
2025-09-06T00:41:02,720 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.model/opendaylight-l2-types/2013.08.27.26.13
2025-09-06T00:41:02,720 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.model/yang-ext/2013.09.07.26.13
2025-09-06T00:41:02,720 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/databind/9.0.0
2025-09-06T00:41:02,720 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/netconf-dom-api/9.0.0
2025-09-06T00:41:02,720 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/keystore-api/9.0.0
2025-09-06T00:41:02,720 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/keystore-none/9.0.0
2025-09-06T00:41:02,720 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf.model/draft-ietf-restconf-server/9.0.0
2025-09-06T00:41:02,720 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf.model/rfc5277/9.0.0
2025-09-06T00:41:02,720 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf.model/sal-remote/9.0.0
2025-09-06T00:41:02,721 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/netconf-api/9.0.0
2025-09-06T00:41:02,721 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/netconf-common-mdsal/9.0.0
2025-09-06T00:41:02,721 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/odl-device-notification/9.0.0
2025-09-06T00:41:02,721 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/restconf-api/9.0.0
2025-09-06T00:41:02,721 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/restconf-mdsal-spi/9.0.0
2025-09-06T00:41:02,721 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/restconf-nb/9.0.0
2025-09-06T00:41:02,721 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/restconf-server/9.0.0
2025-09-06T00:41:02,721 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/restconf-server-api/9.0.0
2025-09-06T00:41:02,721 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/restconf-server-jaxrs/9.0.0
2025-09-06T00:41:02,721 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/restconf-server-mdsal/9.0.0
2025-09-06T00:41:02,721 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/restconf-server-spi/9.0.0
2025-09-06T00:41:02,721 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/restconf-subscription/9.0.0
2025-09-06T00:41:02,721 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/sal-remote-impl/9.0.0
2025-09-06T00:41:02,721 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/shaded-sshd/9.0.0
2025-09-06T00:41:02,721 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/transport-api/9.0.0
2025-09-06T00:41:02,721 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/transport-http/9.0.0
2025-09-06T00:41:02,721 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/transport-ssh/9.0.0
2025-09-06T00:41:02,721 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/transport-tcp/9.0.0
2025-09-06T00:41:02,722 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/transport-tls/9.0.0
2025-09-06T00:41:02,722 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/truststore-api/9.0.0
2025-09-06T00:41:02,722 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/truststore-none/9.0.0
2025-09-06T00:41:02,722 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/yanglib-mdsal-writer/9.0.0
2025-09-06T00:41:02,722 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.odlparent/bundles-diag/14.1.0
2025-09-06T00:41:02,722 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin/openflowplugin/0.20.0
2025-09-06T00:41:02,722 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin/openflowplugin-api/0.20.0
2025-09-06T00:41:02,722 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.applications/arbitratorreconciliation-api/0.20.0
2025-09-06T00:41:02,722 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.applications/arbitratorreconciliation-impl/0.20.0
2025-09-06T00:41:02,722 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.applications/bulk-o-matic/0.20.0
2025-09-06T00:41:02,722 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.applications/device-ownership-service/0.20.0
2025-09-06T00:41:02,722 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.applications/forwardingrules-manager/0.20.0
2025-09-06T00:41:02,722 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.applications/lldp-speaker/0.20.0
2025-09-06T00:41:02,722 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.applications/of-switch-config-pusher/0.20.0
2025-09-06T00:41:02,722 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.applications/reconciliation-framework/0.20.0
2025-09-06T00:41:02,722 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.applications/topology-lldp-discovery/0.20.0
2025-09-06T00:41:02,723 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.applications/topology-manager/0.20.0
2025-09-06T00:41:02,723 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin/openflowplugin-blueprint-config/0.20.0
2025-09-06T00:41:02,723 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin/openflowplugin-common/0.20.0
2025-09-06T00:41:02,723 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin/openflowplugin-extension-api/0.20.0
2025-09-06T00:41:02,723 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin/openflowplugin-extension-onf/0.20.0
2025-09-06T00:41:02,723 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin/openflowplugin-impl/0.20.0
2025-09-06T00:41:02,723 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.libraries/liblldp/0.20.0
2025-09-06T00:41:02,723 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.model/model-flow-base/0.20.0
2025-09-06T00:41:02,723 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.model/model-flow-service/0.20.0
2025-09-06T00:41:02,723 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.model/model-flow-statistics/0.20.0
2025-09-06T00:41:02,723 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.model/model-inventory/0.20.0
2025-09-06T00:41:02,723 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.model/model-topology/0.20.0
2025-09-06T00:41:02,723 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.openflowjava/openflowjava-blueprint-config/0.20.0
2025-09-06T00:41:02,723 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.openflowjava/openflow-protocol-api/0.20.0
2025-09-06T00:41:02,723 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.openflowjava/openflow-protocol-impl/0.20.0
2025-09-06T00:41:02,723 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.openflowjava/openflow-protocol-spi/0.20.0
2025-09-06T00:41:02,723 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.openflowjava/openflowjava-util/0.20.0
2025-09-06T00:41:02,724 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin/srm-api/0.20.0
2025-09-06T00:41:02,724 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin/srm-impl/0.20.0
2025-09-06T00:41:02,724 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin/srm-shell/0.20.0
2025-09-06T00:41:02,724 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/binding-data-codec-api/14.0.14
2025-09-06T00:41:02,724 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/binding-data-codec-dynamic/14.0.14
2025-09-06T00:41:02,724 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/binding-data-codec-osgi/14.0.14
2025-09-06T00:41:02,724 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/binding-data-codec-spi/14.0.14
2025-09-06T00:41:02,724 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/binding-generator/14.0.14
2025-09-06T00:41:02,724 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/binding-loader/14.0.14
2025-09-06T00:41:02,724 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/binding-model/14.0.14
2025-09-06T00:41:02,724 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/binding-reflect/14.0.14
2025-09-06T00:41:02,724 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/binding-runtime-api/14.0.14
2025-09-06T00:41:02,724 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/binding-runtime-osgi/14.0.14
2025-09-06T00:41:02,724 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/binding-runtime-spi/14.0.14
2025-09-06T00:41:02,724 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/binding-spec/14.0.14
2025-09-06T00:41:02,724 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/codegen-extensions/14.0.14
2025-09-06T00:41:02,724 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/concepts/14.0.14
2025-09-06T00:41:02,724 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/odlext-model-api/14.0.14
2025-09-06T00:41:02,724 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/odlext-parser-support/14.0.14
2025-09-06T00:41:02,724 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/openconfig-model-api/14.0.14
2025-09-06T00:41:02,725 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/openconfig-parser-support/14.0.14
2025-09-06T00:41:02,725 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc6241-model-api/14.0.14
2025-09-06T00:41:02,725 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc6241-parser-support/14.0.14
2025-09-06T00:41:02,725 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc6536-model-api/14.0.14
2025-09-06T00:41:02,725 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc6536-parser-support/14.0.14
2025-09-06T00:41:02,725 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc6643-model-api/14.0.14
2025-09-06T00:41:02,725 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc6643-parser-support/14.0.14
2025-09-06T00:41:02,725 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc7952-model-api/14.0.14
2025-09-06T00:41:02,725 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc7952-parser-support/14.0.14
2025-09-06T00:41:02,725 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc8040-model-api/14.0.14
2025-09-06T00:41:02,725 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc8040-parser-support/14.0.14
2025-09-06T00:41:02,725 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc8528-model-api/14.0.14
2025-09-06T00:41:02,725 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc8528-parser-support/14.0.14
2025-09-06T00:41:02,725 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc8639-model-api/14.0.14
2025-09-06T00:41:02,725 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc8639-parser-support/14.0.14
2025-09-06T00:41:02,725 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc8819-model-api/14.0.14
2025-09-06T00:41:02,725 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc8819-parser-support/14.0.14
2025-09-06T00:41:02,725 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/util/14.0.14
2025-09-06T00:41:02,726 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-common/14.0.14
2025-09-06T00:41:02,726 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-common-netty/14.0.14
2025-09-06T00:41:02,726 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-data-api/14.0.14
2025-09-06T00:41:02,726 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-data-codec-binfmt/14.0.14
2025-09-06T00:41:02,726 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-data-codec-gson/14.0.14
2025-09-06T00:41:02,726 | INFO | features-3-thread-1 |
FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-data-codec-xml/14.0.14 2025-09-06T00:41:02,726 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-data-impl/14.0.14 2025-09-06T00:41:02,726 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-data-spi/14.0.14 2025-09-06T00:41:02,726 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-data-transform/14.0.14 2025-09-06T00:41:02,726 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-data-tree-api/14.0.14 2025-09-06T00:41:02,726 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-data-tree-ri/14.0.14 2025-09-06T00:41:02,726 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-data-tree-spi/14.0.14 2025-09-06T00:41:02,726 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-data-util/14.0.14 2025-09-06T00:41:02,726 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-ir/14.0.14 2025-09-06T00:41:02,726 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-model-api/14.0.14 2025-09-06T00:41:02,726 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-model-export/14.0.14 2025-09-06T00:41:02,726 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-model-ri/14.0.14 2025-09-06T00:41:02,726 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-model-spi/14.0.14 2025-09-06T00:41:02,726 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-model-util/14.0.14 2025-09-06T00:41:02,726 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-parser-api/14.0.14 2025-09-06T00:41:02,727 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-parser-impl/14.0.14 2025-09-06T00:41:02,727 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-parser-reactor/14.0.14 2025-09-06T00:41:02,727 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-parser-rfc7950/14.0.14 2025-09-06T00:41:02,727 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-parser-spi/14.0.14 2025-09-06T00:41:02,727 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-repo-api/14.0.14 2025-09-06T00:41:02,727 | INFO | 
features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-repo-fs/14.0.14 2025-09-06T00:41:02,727 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-repo-spi/14.0.14 2025-09-06T00:41:02,727 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-xpath-api/14.0.14 2025-09-06T00:41:02,727 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-xpath-impl/14.0.14 2025-09-06T00:41:02,727 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.url/pax-url-war/2.6.16/jar/uber 2025-09-06T00:41:02,727 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.web/pax-web-api/8.0.30 2025-09-06T00:41:02,727 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.web/pax-web-compatibility-el2/8.0.30 2025-09-06T00:41:02,727 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.web/pax-web-compatibility-servlet31/8.0.30 2025-09-06T00:41:02,727 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.web/pax-web-extender-war/8.0.30 2025-09-06T00:41:02,727 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.web/pax-web-extender-whiteboard/8.0.30 2025-09-06T00:41:02,727 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.web/pax-web-jetty/8.0.30 2025-09-06T00:41:02,727 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.web/pax-web-jsp/8.0.30 2025-09-06T00:41:02,727 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.web/pax-web-runtime/8.0.30 2025-09-06T00:41:02,727 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.web/pax-web-spi/8.0.30 2025-09-06T00:41:02,727 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.web/pax-web-tomcat-common/8.0.30 2025-09-06T00:41:02,728 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.web/pax-web-websocket/8.0.30 2025-09-06T00:41:02,728 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.osgi/org.osgi.service.component/1.5.1 2025-09-06T00:41:02,728 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.owasp.encoder/encoder/1.3.1 2025-09-06T00:41:02,728 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.scala-lang.modules/scala-parser-combinators_2.13/1.1.2 2025-09-06T00:41:02,728 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.scala-lang/scala-library/2.13.16 2025-09-06T00:41:02,728 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | 
mvn:org.scala-lang/scala-reflect/2.13.16 2025-09-06T00:41:02,728 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.reactivestreams/reactive-streams/1.0.4 2025-09-06T00:41:02,728 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.codehaus.woodstox/stax2-api/4.2.2 2025-09-06T00:41:02,728 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:tech.pantheon.triemap/triemap/1.3.2 2025-09-06T00:41:02,728 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | wrap:mvn:net.java.dev.stax-utils/stax-utils/20070216 2025-09-06T00:41:02,728 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | wrap:mvn:org.lmdbjava/lmdbjava/0.7.0 2025-09-06T00:41:02,728 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Stopping bundles: 2025-09-06T00:41:02,729 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.ops4j.pax.jdbc.pool.common/1.5.7 2025-09-06T00:41:02,730 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.servicemix.bundles.javax-inject/1.0.0.3 2025-09-06T00:41:02,730 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.servicemix.bundles.jasypt/1.9.3.1 2025-09-06T00:41:02,730 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | javax.transaction-api/1.2.0 2025-09-06T00:41:02,730 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | javax.enterprise.cdi-api/2.0.0.SP1 2025-09-06T00:41:02,731 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | javax.el-api/3.0.3 2025-09-06T00:41:02,731 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.ops4j.pax.jdbc.config/1.5.7 2025-09-06T00:41:02,731 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Uninstalling bundles: 2025-09-06T00:41:02,731 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.servicemix.bundles.javax-inject/1.0.0.3 2025-09-06T00:41:02,735 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Installing bundles: 2025-09-06T00:41:02,736 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.checkerframework/checker-qual/3.49.3 2025-09-06T00:41:02,738 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:com.google.code.gson/gson/2.13.1 2025-09-06T00:41:02,739 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:com.google.guava/guava/33.4.8-jre 2025-09-06T00:41:02,743 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:com.google.guava/failureaccess/1.0.3 2025-09-06T00:41:02,744 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:com.googlecode.json-simple/json-simple/1.1.1 2025-09-06T00:41:02,745 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | 
mvn:com.h2database/h2/2.3.232 2025-09-06T00:41:02,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:com.rabbitmq/amqp-client/5.25.0 2025-09-06T00:41:02,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:com.typesafe/config/1.4.3 2025-09-06T00:41:02,754 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:com.typesafe/ssl-config-core_2.13/0.6.1 2025-09-06T00:41:02,755 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.aeron/aeron-client/1.38.1 2025-09-06T00:41:02,756 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.aeron/aeron-driver/1.38.1 2025-09-06T00:41:02,758 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.dropwizard.metrics/metrics-core/4.2.32 2025-09-06T00:41:02,758 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.dropwizard.metrics/metrics-graphite/4.2.32 2025-09-06T00:41:02,759 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.dropwizard.metrics/metrics-healthchecks/4.2.32 2025-09-06T00:41:02,760 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.dropwizard.metrics/metrics-jmx/4.2.32 2025-09-06T00:41:02,761 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.dropwizard.metrics/metrics-jvm/4.2.32 2025-09-06T00:41:02,761 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.netty/netty-buffer/4.2.2.Final 2025-09-06T00:41:02,762 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.netty/netty-codec-base/4.2.2.Final 2025-09-06T00:41:02,763 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.netty/netty-codec-compression/4.2.2.Final 2025-09-06T00:41:02,764 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.netty/netty-codec-http/4.2.2.Final 2025-09-06T00:41:02,766 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.netty/netty-codec-http2/4.2.2.Final 2025-09-06T00:41:02,768 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.netty/netty-common/4.2.2.Final 2025-09-06T00:41:02,769 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.netty/netty-handler/4.2.2.Final 2025-09-06T00:41:02,771 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.netty/netty-resolver/4.2.2.Final 2025-09-06T00:41:02,772 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.netty/netty-transport/4.2.2.Final 2025-09-06T00:41:02,774 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.netty/netty-transport-classes-epoll/4.2.2.Final 2025-09-06T00:41:02,774 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.netty/netty-transport-native-epoll/4.2.2.Final/jar/linux-x86_64 2025-09-06T00:41:02,776 | 
INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:io.netty/netty-transport-native-unix-common/4.2.2.Final 2025-09-06T00:41:02,777 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:jakarta.activation/jakarta.activation-api/1.2.2 2025-09-06T00:41:02,778 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:jakarta.annotation/jakarta.annotation-api/1.3.5 2025-09-06T00:41:02,778 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:jakarta.servlet/jakarta.servlet-api/4.0.4 2025-09-06T00:41:02,779 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:jakarta.validation/jakarta.validation-api/2.0.2 2025-09-06T00:41:02,780 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:jakarta.ws.rs/jakarta.ws.rs-api/2.1.6 2025-09-06T00:41:02,781 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.javassist/javassist/3.30.2-GA 2025-09-06T00:41:02,783 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:javax.servlet/javax.servlet-api/3.1.0 2025-09-06T00:41:02,784 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:jakarta.websocket/jakarta.websocket-api/1.1.2 2025-09-06T00:41:02,784 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.odlparent/karaf.branding/14.1.0 2025-09-06T00:41:02,785 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.lz4/lz4-java/1.8.0 2025-09-06T00:41:02,786 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:net.bytebuddy/byte-buddy/1.17.5 2025-09-06T00:41:02,798 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.agrona/agrona/1.15.2 2025-09-06T00:41:02,800 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.antlr/antlr4-runtime/4.13.2 2025-09-06T00:41:02,801 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.aries.blueprint/org.apache.aries.blueprint.api/1.0.1 2025-09-06T00:41:02,802 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.aries.blueprint/org.apache.aries.blueprint.cm/1.3.2 2025-09-06T00:41:02,802 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.aries.blueprint/org.apache.aries.blueprint.core/1.10.3 2025-09-06T00:41:02,804 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.api/1.1.5 2025-09-06T00:41:02,805 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.blueprint.api/1.2.0 2025-09-06T00:41:02,805 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.blueprint.core/1.2.0 2025-09-06T00:41:02,806 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.core/1.1.8 2025-09-06T00:41:02,807 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.whiteboard/1.2.0 2025-09-06T00:41:02,808 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.aries.proxy/org.apache.aries.proxy/1.1.14 2025-09-06T00:41:02,809 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.aries.quiesce/org.apache.aries.quiesce.api/1.0.0 2025-09-06T00:41:02,809 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.aries/org.apache.aries.util/1.1.3 2025-09-06T00:41:02,838 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:commons-collections/commons-collections/3.2.2 2025-09-06T00:41:02,841 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:commons-beanutils/commons-beanutils/1.11.0 2025-09-06T00:41:02,843 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:commons-codec/commons-codec/1.15 2025-09-06T00:41:02,845 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.commons/commons-lang3/3.17.0 2025-09-06T00:41:02,847 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.commons/commons-text/1.13.0 2025-09-06T00:41:02,849 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.felix/org.apache.felix.scr/2.2.6 2025-09-06T00:41:02,850 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.geronimo.specs/geronimo-atinject_1.0_spec/1.2 2025-09-06T00:41:02,851 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.bundle/org.apache.karaf.bundle.blueprintstate/4.4.7 2025-09-06T00:41:02,851 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.bundle/org.apache.karaf.bundle.core/4.4.7 2025-09-06T00:41:02,853 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.config/org.apache.karaf.config.command/4.4.7 2025-09-06T00:41:02,853 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.blueprint/4.4.7 2025-09-06T00:41:02,854 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.features/4.4.7 2025-09-06T00:41:02,855 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.kar/4.4.7 2025-09-06T00:41:02,856 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.wrap/4.4.7 2025-09-06T00:41:02,857 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.diagnostic/org.apache.karaf.diagnostic.boot/4.4.7 
2025-09-06T00:41:02,858 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.diagnostic/org.apache.karaf.diagnostic.core/4.4.7 2025-09-06T00:41:02,859 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.features/org.apache.karaf.features.command/4.4.7 2025-09-06T00:41:02,860 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.http/org.apache.karaf.http.core/4.4.7 2025-09-06T00:41:02,862 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.instance/org.apache.karaf.instance.core/4.4.7 2025-09-06T00:41:02,863 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.jaas/org.apache.karaf.jaas.command/4.4.7 2025-09-06T00:41:02,864 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.jaas/org.apache.karaf.jaas.config/4.4.7 2025-09-06T00:41:02,865 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.jaas/org.apache.karaf.jaas.modules/4.4.7 2025-09-06T00:41:02,868 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.jdbc/org.apache.karaf.jdbc.core/4.4.7 2025-09-06T00:41:02,869 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.kar/org.apache.karaf.kar.core/4.4.7 2025-09-06T00:41:02,870 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.log/org.apache.karaf.log.core/4.4.7 2025-09-06T00:41:02,870 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.management/org.apache.karaf.management.server/4.4.7 2025-09-06T00:41:02,871 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.package/org.apache.karaf.package.core/4.4.7 2025-09-06T00:41:02,872 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.scr/org.apache.karaf.scr.management/4.4.7 2025-09-06T00:41:02,873 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.scr/org.apache.karaf.scr.state/4.4.7 2025-09-06T00:41:02,874 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.service/org.apache.karaf.service.core/4.4.7 2025-09-06T00:41:02,875 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.commands/4.4.7 2025-09-06T00:41:02,876 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.console/4.4.7 2025-09-06T00:41:02,877 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.core/4.4.7 2025-09-06T00:41:02,879 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.ssh/4.4.7 
2025-09-06T00:41:02,880 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.table/4.4.7 2025-09-06T00:41:02,881 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.system/org.apache.karaf.system.core/4.4.7 2025-09-06T00:41:02,882 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.karaf.web/org.apache.karaf.web.core/4.4.7 2025-09-06T00:41:02,882 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.sshd/sshd-osgi/2.14.0 2025-09-06T00:41:02,887 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.sshd/sshd-scp/2.14.0 2025-09-06T00:41:02,889 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.apache.sshd/sshd-sftp/2.14.0 2025-09-06T00:41:02,890 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jdt/ecj/3.26.0 2025-09-06T00:41:02,896 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-client/9.4.57.v20241219 2025-09-06T00:41:02,897 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-continuation/9.4.57.v20241219 2025-09-06T00:41:02,898 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-http/9.4.57.v20241219 2025-09-06T00:41:02,900 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-io/9.4.57.v20241219 2025-09-06T00:41:02,901 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-jaas/9.4.57.v20241219 2025-09-06T00:41:02,901 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-jmx/9.4.57.v20241219 2025-09-06T00:41:02,902 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-security/9.4.57.v20241219 2025-09-06T00:41:02,903 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-server/9.4.57.v20241219 2025-09-06T00:41:02,905 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-servlet/9.4.57.v20241219 2025-09-06T00:41:02,906 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-servlets/9.4.57.v20241219 2025-09-06T00:41:02,907 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-util/9.4.57.v20241219 2025-09-06T00:41:02,908 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-util-ajax/9.4.57.v20241219 2025-09-06T00:41:02,909 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.eclipse.jetty/jetty-xml/9.4.57.v20241219 2025-09-06T00:41:02,910 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.7 | mvn:org.glassfish.hk2/hk2-api/2.6.1 2025-09-06T00:41:02,911 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.glassfish.hk2.external/aopalliance-repackaged/2.6.1 2025-09-06T00:41:02,912 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.glassfish.hk2/hk2-locator/2.6.1 2025-09-06T00:41:02,913 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.glassfish.hk2/osgi-resource-locator/1.0.3 2025-09-06T00:41:02,914 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.glassfish.hk2/hk2-utils/2.6.1 2025-09-06T00:41:02,915 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.glassfish.jersey.containers/jersey-container-servlet/2.47 2025-09-06T00:41:02,916 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.glassfish.jersey.containers/jersey-container-servlet-core/2.47 2025-09-06T00:41:02,917 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.glassfish.jersey.core/jersey-client/2.47 2025-09-06T00:41:02,918 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.glassfish.jersey.core/jersey-common/2.47 2025-09-06T00:41:02,921 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.glassfish.jersey.core/jersey-server/2.47 2025-09-06T00:41:02,923 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.glassfish.jersey.inject/jersey-hk2/2.47 2025-09-06T00:41:02,924 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.glassfish.jersey.media/jersey-media-sse/2.47 2025-09-06T00:41:02,925 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.jline/jline/3.21.0 2025-09-06T00:41:02,927 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.jolokia/jolokia-osgi/1.7.2 2025-09-06T00:41:02,929 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.jspecify/jspecify/1.0.0 2025-09-06T00:41:02,930 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ow2.asm/asm/9.7.1 2025-09-06T00:41:02,930 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ow2.asm/asm-commons/9.7.1 2025-09-06T00:41:02,931 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ow2.asm/asm-tree/9.7.1 2025-09-06T00:41:02,932 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ow2.asm/asm-analysis/9.7.1 2025-09-06T00:41:02,934 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ow2.asm/asm-util/9.7.1 2025-09-06T00:41:02,935 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/aaa-authn-api/0.21.0 2025-09-06T00:41:02,937 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | 
mvn:org.opendaylight.aaa/aaa-cert/0.21.0 2025-09-06T00:41:02,940 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/aaa-encrypt-service/0.21.0 2025-09-06T00:41:02,941 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/aaa-encrypt-service-impl/0.21.0 2025-09-06T00:41:02,942 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/aaa-filterchain/0.21.0 2025-09-06T00:41:02,943 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/aaa-idm-store-h2/0.21.0 2025-09-06T00:41:02,944 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/aaa-jetty-auth-log-filter/0.21.0 2025-09-06T00:41:02,945 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/aaa-password-service-api/0.21.0 2025-09-06T00:41:02,945 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/aaa-password-service-impl/0.21.0 2025-09-06T00:41:02,946 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/repackaged-shiro/0.21.0 2025-09-06T00:41:02,948 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/aaa-shiro/0.21.0 2025-09-06T00:41:02,950 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/aaa-shiro-api/0.21.0 2025-09-06T00:41:02,951 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa/aaa-tokenauthrealm/0.21.0 2025-09-06T00:41:02,952 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa.web/web-api/0.21.0 2025-09-06T00:41:02,953 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa.web/web-osgi-impl/0.21.0 2025-09-06T00:41:02,954 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa.web/servlet-api/0.21.0 2025-09-06T00:41:02,954 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.aaa.web/servlet-jersey2/0.21.0 2025-09-06T00:41:02,955 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/atomix-storage/11.0.0 2025-09-06T00:41:02,956 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/blueprint/11.0.0 2025-09-06T00:41:02,957 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/cds-access-api/11.0.0 2025-09-06T00:41:02,958 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/cds-access-client/11.0.0 2025-09-06T00:41:02,959 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/cds-dom-api/11.0.0 2025-09-06T00:41:02,960 | 
INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/cds-mgmt-api/11.0.0 2025-09-06T00:41:02,961 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/eos-dom-akka/11.0.0 2025-09-06T00:41:02,962 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/raft-api/11.0.0 2025-09-06T00:41:02,963 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/raft-journal/11.0.0 2025-09-06T00:41:02,964 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/raft-spi/11.0.0 2025-09-06T00:41:02,965 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/repackaged-pekko/11.0.0 2025-09-06T00:41:02,988 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/sal-akka-raft/11.0.0 2025-09-06T00:41:02,990 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/sal-akka-segmented-journal/11.0.0 2025-09-06T00:41:02,991 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/sal-cluster-admin-api/11.0.0 2025-09-06T00:41:02,992 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/sal-cluster-admin-impl/11.0.0 2025-09-06T00:41:02,994 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/sal-clustering-commons/11.0.0 2025-09-06T00:41:02,995 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/sal-common-util/11.0.0 2025-09-06T00:41:02,996 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/sal-distributed-datastore/11.0.0 2025-09-06T00:41:02,998 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.controller/sal-remoterpc-connector/11.0.0 2025-09-06T00:41:02,999 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.infrautils/diagstatus-api/7.1.4 2025-09-06T00:41:03,000 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.infrautils/diagstatus-impl/7.1.4 2025-09-06T00:41:03,001 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.infrautils/diagstatus-shell/7.1.4 2025-09-06T00:41:03,002 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.infrautils/ready-api/7.1.4 2025-09-06T00:41:03,002 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.infrautils/ready-impl/7.1.4 2025-09-06T00:41:03,003 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.infrautils/infrautils-util/7.1.4 
2025-09-06T00:41:03,004 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-binding-dom-adapter/14.0.13 2025-09-06T00:41:03,005 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-binding-util/14.0.13 2025-09-06T00:41:03,006 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-crypt-hash/14.0.13 2025-09-06T00:41:03,007 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-ssh-encryption-algs/14.0.13 2025-09-06T00:41:03,007 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-ssh-key-exchange-algs/14.0.13 2025-09-06T00:41:03,008 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-ssh-mac-algs/14.0.13 2025-09-06T00:41:03,009 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-ssh-public-key-algs/14.0.13 2025-09-06T00:41:03,010 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-tls-cipher-suite-algs/14.0.13 2025-09-06T00:41:03,010 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6241/14.0.13 2025-09-06T00:41:03,012 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6243/14.0.13 2025-09-06T00:41:03,013 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6470/14.0.13 2025-09-06T00:41:03,014 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6991-ietf-inet-types/14.0.13 2025-09-06T00:41:03,015 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6991-ietf-yang-types/14.0.13 2025-09-06T00:41:03,016 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc7407-ietf-x509-cert-to-name/14.0.13 2025-09-06T00:41:03,017 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc7952/14.0.13 2025-09-06T00:41:03,018 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8040-ietf-restconf/14.0.13 2025-09-06T00:41:03,019 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8040-ietf-restconf-monitoring/14.0.13 2025-09-06T00:41:03,020 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8072/14.0.13 2025-09-06T00:41:03,021 | INFO | 
features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8341/14.0.13 2025-09-06T00:41:03,022 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8342-ietf-datastores/14.0.13 2025-09-06T00:41:03,023 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8342-ietf-origin/14.0.13 2025-09-06T00:41:03,024 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8343/14.0.13 2025-09-06T00:41:03,025 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8344/14.0.13 2025-09-06T00:41:03,026 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8525/14.0.13 2025-09-06T00:41:03,028 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8526/14.0.13 2025-09-06T00:41:03,029 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8528/14.0.13 2025-09-06T00:41:03,030 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8529/14.0.13 2025-09-06T00:41:03,031 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8639/14.0.13 2025-09-06T00:41:03,033 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8650/14.0.13 2025-09-06T00:41:03,033 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9640/14.0.13 2025-09-06T00:41:03,035 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9641/14.0.13 2025-09-06T00:41:03,036 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9642/14.0.13 2025-09-06T00:41:03,038 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9643-ietf-tcp-client/14.0.13 2025-09-06T00:41:03,039 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9643-ietf-tcp-common/14.0.13 2025-09-06T00:41:03,040 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9643-ietf-tcp-server/14.0.13 2025-09-06T00:41:03,040 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9644-ietf-ssh-client/14.0.13 2025-09-06T00:41:03,042 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | 
mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9644-ietf-ssh-common/14.0.13 2025-09-06T00:41:03,043 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9644-ietf-ssh-server/14.0.13 2025-09-06T00:41:03,044 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9645-ietf-tls-client/14.0.13 2025-09-06T00:41:03,045 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9645-ietf-tls-common/14.0.13 2025-09-06T00:41:03,047 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9645-ietf-tls-server/14.0.13 2025-09-06T00:41:03,048 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-eos-binding-adapter/14.0.13 2025-09-06T00:41:03,049 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-binding-api/14.0.13 2025-09-06T00:41:03,049 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-binding-spi/14.0.13 2025-09-06T00:41:03,050 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-common-api/14.0.13 2025-09-06T00:41:03,051 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-dom-api/14.0.13 2025-09-06T00:41:03,052 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-dom-broker/14.0.13 2025-09-06T00:41:03,053 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-dom-schema-osgi/14.0.13 2025-09-06T00:41:03,053 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-dom-spi/14.0.13 2025-09-06T00:41:03,054 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-eos-binding-api/14.0.13 2025-09-06T00:41:03,055 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-eos-common-api/14.0.13 2025-09-06T00:41:03,056 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-eos-dom-api/14.0.13 2025-09-06T00:41:03,056 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-singleton-api/14.0.13 2025-09-06T00:41:03,057 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal/mdsal-singleton-impl/14.0.13 2025-09-06T00:41:03,058 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.model/general-entity/14.0.13 2025-09-06T00:41:03,058 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.model/ietf-topology/2013.10.21.26.13 
2025-09-06T00:41:03,059 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.model/ietf-type-util/14.0.13 2025-09-06T00:41:03,060 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.model/opendaylight-l2-types/2013.08.27.26.13 2025-09-06T00:41:03,061 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.mdsal.model/yang-ext/2013.09.07.26.13 2025-09-06T00:41:03,062 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/databind/9.0.0 2025-09-06T00:41:03,062 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/netconf-dom-api/9.0.0 2025-09-06T00:41:03,063 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/keystore-api/9.0.0 2025-09-06T00:41:03,064 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/keystore-none/9.0.0 2025-09-06T00:41:03,064 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf.model/draft-ietf-restconf-server/9.0.0 2025-09-06T00:41:03,066 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf.model/rfc5277/9.0.0 2025-09-06T00:41:03,067 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf.model/sal-remote/9.0.0 2025-09-06T00:41:03,068 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/netconf-api/9.0.0 2025-09-06T00:41:03,069 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/netconf-common-mdsal/9.0.0 2025-09-06T00:41:03,069 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/odl-device-notification/9.0.0 2025-09-06T00:41:03,070 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/restconf-api/9.0.0 2025-09-06T00:41:03,071 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/restconf-mdsal-spi/9.0.0 2025-09-06T00:41:03,072 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/restconf-nb/9.0.0 2025-09-06T00:41:03,073 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/restconf-server/9.0.0 2025-09-06T00:41:03,074 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/restconf-server-api/9.0.0 2025-09-06T00:41:03,075 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/restconf-server-jaxrs/9.0.0 2025-09-06T00:41:03,076 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/restconf-server-mdsal/9.0.0 
2025-09-06T00:41:03,077 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/restconf-server-spi/9.0.0 2025-09-06T00:41:03,078 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/restconf-subscription/9.0.0 2025-09-06T00:41:03,079 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/sal-remote-impl/9.0.0 2025-09-06T00:41:03,080 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/shaded-sshd/9.0.0 2025-09-06T00:41:03,103 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/transport-api/9.0.0 2025-09-06T00:41:03,104 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/transport-http/9.0.0 2025-09-06T00:41:03,109 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/transport-ssh/9.0.0 2025-09-06T00:41:03,110 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/transport-tcp/9.0.0 2025-09-06T00:41:03,111 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/transport-tls/9.0.0 2025-09-06T00:41:03,112 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/truststore-api/9.0.0 2025-09-06T00:41:03,112 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/truststore-none/9.0.0 2025-09-06T00:41:03,113 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.netconf/yanglib-mdsal-writer/9.0.0 2025-09-06T00:41:03,114 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.odlparent/bundles-diag/14.1.0 2025-09-06T00:41:03,114 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin/openflowplugin/0.20.0 2025-09-06T00:41:03,118 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin/openflowplugin-api/0.20.0 2025-09-06T00:41:03,119 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.applications/arbitratorreconciliation-api/0.20.0 2025-09-06T00:41:03,120 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.applications/arbitratorreconciliation-impl/0.20.0 2025-09-06T00:41:03,121 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.applications/bulk-o-matic/0.20.0 2025-09-06T00:41:03,122 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.applications/device-ownership-service/0.20.0 2025-09-06T00:41:03,123 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core 
- 4.4.7 | mvn:org.opendaylight.openflowplugin.applications/forwardingrules-manager/0.20.0 2025-09-06T00:41:03,124 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.applications/lldp-speaker/0.20.0 2025-09-06T00:41:03,125 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.applications/of-switch-config-pusher/0.20.0 2025-09-06T00:41:03,126 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.applications/reconciliation-framework/0.20.0 2025-09-06T00:41:03,127 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.applications/topology-lldp-discovery/0.20.0 2025-09-06T00:41:03,128 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.applications/topology-manager/0.20.0 2025-09-06T00:41:03,129 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin/openflowplugin-blueprint-config/0.20.0 2025-09-06T00:41:03,129 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin/openflowplugin-common/0.20.0 2025-09-06T00:41:03,130 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin/openflowplugin-extension-api/0.20.0 2025-09-06T00:41:03,133 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin/openflowplugin-extension-onf/0.20.0 2025-09-06T00:41:03,134 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin/openflowplugin-impl/0.20.0 2025-09-06T00:41:03,139 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.libraries/liblldp/0.20.0 2025-09-06T00:41:03,140 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.model/model-flow-base/0.20.0 2025-09-06T00:41:03,146 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.model/model-flow-service/0.20.0 2025-09-06T00:41:03,151 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.model/model-flow-statistics/0.20.0 2025-09-06T00:41:03,154 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.model/model-inventory/0.20.0 2025-09-06T00:41:03,155 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.model/model-topology/0.20.0 2025-09-06T00:41:03,156 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.openflowjava/openflowjava-blueprint-config/0.20.0 2025-09-06T00:41:03,157 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | 
mvn:org.opendaylight.openflowplugin.openflowjava/openflow-protocol-api/0.20.0 2025-09-06T00:41:03,165 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.openflowjava/openflow-protocol-impl/0.20.0 2025-09-06T00:41:03,169 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.openflowjava/openflow-protocol-spi/0.20.0 2025-09-06T00:41:03,170 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin.openflowjava/openflowjava-util/0.20.0 2025-09-06T00:41:03,171 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin/srm-api/0.20.0 2025-09-06T00:41:03,172 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin/srm-impl/0.20.0 2025-09-06T00:41:03,173 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.openflowplugin/srm-shell/0.20.0 2025-09-06T00:41:03,174 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/binding-data-codec-api/14.0.14 2025-09-06T00:41:03,174 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/binding-data-codec-dynamic/14.0.14 2025-09-06T00:41:03,176 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/binding-data-codec-osgi/14.0.14 2025-09-06T00:41:03,176 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/binding-data-codec-spi/14.0.14 2025-09-06T00:41:03,177 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/binding-generator/14.0.14 2025-09-06T00:41:03,178 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/binding-loader/14.0.14 2025-09-06T00:41:03,179 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/binding-model/14.0.14 2025-09-06T00:41:03,180 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/binding-reflect/14.0.14 2025-09-06T00:41:03,181 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/binding-runtime-api/14.0.14 2025-09-06T00:41:03,181 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/binding-runtime-osgi/14.0.14 2025-09-06T00:41:03,182 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/binding-runtime-spi/14.0.14 2025-09-06T00:41:03,183 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/binding-spec/14.0.14 2025-09-06T00:41:03,184 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | 
mvn:org.opendaylight.yangtools/codegen-extensions/14.0.14 2025-09-06T00:41:03,185 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/concepts/14.0.14 2025-09-06T00:41:03,185 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/odlext-model-api/14.0.14 2025-09-06T00:41:03,186 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/odlext-parser-support/14.0.14 2025-09-06T00:41:03,187 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/openconfig-model-api/14.0.14 2025-09-06T00:41:03,188 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/openconfig-parser-support/14.0.14 2025-09-06T00:41:03,188 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc6241-model-api/14.0.14 2025-09-06T00:41:03,189 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc6241-parser-support/14.0.14 2025-09-06T00:41:03,190 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc6536-model-api/14.0.14 2025-09-06T00:41:03,190 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc6536-parser-support/14.0.14 2025-09-06T00:41:03,192 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc6643-model-api/14.0.14 2025-09-06T00:41:03,192 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc6643-parser-support/14.0.14 2025-09-06T00:41:03,193 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc7952-model-api/14.0.14 2025-09-06T00:41:03,194 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc7952-parser-support/14.0.14 2025-09-06T00:41:03,195 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc8040-model-api/14.0.14 2025-09-06T00:41:03,195 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc8040-parser-support/14.0.14 2025-09-06T00:41:03,196 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc8528-model-api/14.0.14 2025-09-06T00:41:03,197 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc8528-parser-support/14.0.14 2025-09-06T00:41:03,197 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc8639-model-api/14.0.14 2025-09-06T00:41:03,198 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc8639-parser-support/14.0.14 2025-09-06T00:41:03,198 | INFO | 
features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc8819-model-api/14.0.14 2025-09-06T00:41:03,199 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/rfc8819-parser-support/14.0.14 2025-09-06T00:41:03,200 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/util/14.0.14 2025-09-06T00:41:03,200 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-common/14.0.14 2025-09-06T00:41:03,201 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-common-netty/14.0.14 2025-09-06T00:41:03,202 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-data-api/14.0.14 2025-09-06T00:41:03,203 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-data-codec-binfmt/14.0.14 2025-09-06T00:41:03,204 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-data-codec-gson/14.0.14 2025-09-06T00:41:03,205 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-data-codec-xml/14.0.14 2025-09-06T00:41:03,206 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-data-impl/14.0.14 2025-09-06T00:41:03,207 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-data-spi/14.0.14 2025-09-06T00:41:03,208 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-data-transform/14.0.14 2025-09-06T00:41:03,208 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-data-tree-api/14.0.14 2025-09-06T00:41:03,209 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-data-tree-ri/14.0.14 2025-09-06T00:41:03,210 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-data-tree-spi/14.0.14 2025-09-06T00:41:03,211 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-data-util/14.0.14 2025-09-06T00:41:03,212 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-ir/14.0.14 2025-09-06T00:41:03,213 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-model-api/14.0.14 2025-09-06T00:41:03,214 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-model-export/14.0.14 2025-09-06T00:41:03,214 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-model-ri/14.0.14 
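Every artifact in this installation listing is addressed by a Pax URL mvn: coordinate of the form mvn:groupId/artifactId/version[/type[/classifier]] (compare the .../pax-url-war/2.6.16/jar/uber entry just below, and the wrap:mvn: entries further down). As a rough illustration of how such a coordinate maps onto the standard Maven repository layout that the resolver searches, here is a small helper; it is a simplified sketch that ignores repository selection, SNAPSHOT handling and mvn: option flags, all of which Pax URL deals with for real.

# Sketch: translate a Pax-URL-style mvn: coordinate into the relative path used
# by the standard Maven repository layout. Simplified; real resolution (system/
# local/remote repositories, SNAPSHOT metadata, options) is done by Pax URL.
def mvn_to_repo_path(url: str) -> str:
    coord = url.removeprefix("wrap:").removeprefix("mvn:")
    group, artifact, version, *rest = coord.split("/")
    ext = rest[0] if rest else "jar"                      # optional type
    classifier = f"-{rest[1]}" if len(rest) > 1 else ""   # optional classifier
    return (f"{group.replace('.', '/')}/{artifact}/{version}/"
            f"{artifact}-{version}{classifier}.{ext}")

if __name__ == "__main__":
    for u in (
        "mvn:org.opendaylight.yangtools/yang-parser-impl/14.0.14",
        "mvn:org.ops4j.pax.url/pax-url-war/2.6.16/jar/uber",
    ):
        print(u, "->", mvn_to_repo_path(u))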
2025-09-06T00:41:03,216 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-model-spi/14.0.14 2025-09-06T00:41:03,217 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-model-util/14.0.14 2025-09-06T00:41:03,217 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-parser-api/14.0.14 2025-09-06T00:41:03,218 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-parser-impl/14.0.14 2025-09-06T00:41:03,219 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-parser-reactor/14.0.14 2025-09-06T00:41:03,220 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-parser-rfc7950/14.0.14 2025-09-06T00:41:03,221 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-parser-spi/14.0.14 2025-09-06T00:41:03,222 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-repo-api/14.0.14 2025-09-06T00:41:03,223 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-repo-fs/14.0.14 2025-09-06T00:41:03,223 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-repo-spi/14.0.14 2025-09-06T00:41:03,224 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-xpath-api/14.0.14 2025-09-06T00:41:03,225 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.opendaylight.yangtools/yang-xpath-impl/14.0.14 2025-09-06T00:41:03,225 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.url/pax-url-war/2.6.16/jar/uber 2025-09-06T00:41:03,228 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.web/pax-web-api/8.0.30 2025-09-06T00:41:03,230 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.web/pax-web-compatibility-el2/8.0.30 2025-09-06T00:41:03,230 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.web/pax-web-compatibility-servlet31/8.0.30 2025-09-06T00:41:03,231 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.web/pax-web-extender-war/8.0.30 2025-09-06T00:41:03,232 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.web/pax-web-extender-whiteboard/8.0.30 2025-09-06T00:41:03,233 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.web/pax-web-jetty/8.0.30 2025-09-06T00:41:03,233 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.web/pax-web-jsp/8.0.30 2025-09-06T00:41:03,235 | INFO | 
features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.web/pax-web-runtime/8.0.30 2025-09-06T00:41:03,236 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.web/pax-web-spi/8.0.30 2025-09-06T00:41:03,238 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.web/pax-web-tomcat-common/8.0.30 2025-09-06T00:41:03,239 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.ops4j.pax.web/pax-web-websocket/8.0.30 2025-09-06T00:41:03,240 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.osgi/org.osgi.service.component/1.5.1 2025-09-06T00:41:03,240 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.owasp.encoder/encoder/1.3.1 2025-09-06T00:41:03,241 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.scala-lang.modules/scala-parser-combinators_2.13/1.1.2 2025-09-06T00:41:03,242 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.scala-lang/scala-library/2.13.16 2025-09-06T00:41:03,249 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.scala-lang/scala-reflect/2.13.16 2025-09-06T00:41:03,257 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.reactivestreams/reactive-streams/1.0.4 2025-09-06T00:41:03,259 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:org.codehaus.woodstox/stax2-api/4.2.2 2025-09-06T00:41:03,260 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | mvn:tech.pantheon.triemap/triemap/1.3.2 2025-09-06T00:41:03,927 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | wrap:mvn:net.java.dev.stax-utils/stax-utils/20070216 2025-09-06T00:41:03,929 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | wrap:mvn:org.lmdbjava/lmdbjava/0.7.0 2025-09-06T00:41:03,947 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Creating configuration file /tmp/karaf-0.23.0/etc/opendaylight/datastore/initial/config/aaa-password-service-config.xml 2025-09-06T00:41:03,948 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Creating configuration file /tmp/karaf-0.23.0/configuration/factory/pekko.conf 2025-09-06T00:41:03,948 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Creating configuration file /tmp/karaf-0.23.0/etc/org.opendaylight.controller.cluster.datastore.cfg 2025-09-06T00:41:03,953 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Creating configuration file /tmp/karaf-0.23.0//etc/org.jolokia.osgi.cfg 2025-09-06T00:41:03,953 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Creating configuration file /tmp/karaf-0.23.0/etc/org.opendaylight.openflowplugin.cfg 2025-09-06T00:41:03,953 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Creating configuration file 
/tmp/karaf-0.23.0/etc/opendaylight/datastore/initial/config/default-openflow-connection-config.xml
2025-09-06T00:41:03,953 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Creating configuration file /tmp/karaf-0.23.0/etc/opendaylight/datastore/initial/config/legacy-openflow-connection-config.xml
2025-09-06T00:41:03,954 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Creating configuration file /tmp/karaf-0.23.0/etc/opendaylight/datastore/initial/config/aaa-cert-config.xml
2025-09-06T00:41:03,954 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Creating configuration file /tmp/karaf-0.23.0/etc/jetty-web.xml
2025-09-06T00:41:03,956 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Creating configuration file /tmp/karaf-0.23.0/etc/org.opendaylight.restconf.nb.rfc8040.cfg
2025-09-06T00:41:03,958 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Creating configuration file /tmp/karaf-0.23.0/etc/opendaylight/datastore/initial/config/aaa-app-config.xml
2025-09-06T00:41:03,958 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Creating configuration file /tmp/karaf-0.23.0/etc/opendaylight/datastore/initial/config/aaa-datastore-config.xml
2025-09-06T00:41:03,959 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Creating configuration file /tmp/karaf-0.23.0/bin/idmtool
2025-09-06T00:41:03,959 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Creating configuration file /tmp/karaf-0.23.0//etc/org.opendaylight.aaa.filterchain.cfg
2025-09-06T00:41:03,959 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Refreshing bundles:
2025-09-06T00:41:03,959 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | javax.el-api/3.0.3 (Attached fragments changed: [org.ops4j.pax.web.pax-web-compatibility-el2/8.0.30])
2025-09-06T00:41:03,959 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | javax.enterprise.cdi-api/2.0.0.SP1 (Wired to javax.el-api/3.0.3 which is being refreshed)
2025-09-06T00:41:03,959 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | javax.transaction-api/1.2.0 (Wired to javax.enterprise.cdi-api/2.0.0.SP1 which is being refreshed)
2025-09-06T00:41:03,959 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.servicemix.bundles.jasypt/1.9.3.1 (Should be wired to: jakarta.servlet-api/4.0.0 (through [org.apache.servicemix.bundles.jasypt/1.9.3.1] osgi.wiring.package; resolution:=optional; filter:="(osgi.wiring.package=javax.servlet)"))
2025-09-06T00:41:03,960 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.servicemix.bundles.javax-inject/1.0.0.3 (Bundle will be uninstalled)
2025-09-06T00:41:03,960 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.ops4j.pax.jdbc.config/1.5.7 (Wired to org.apache.servicemix.bundles.jasypt/1.9.3.1 which is being refreshed)
2025-09-06T00:41:03,960 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core
- 4.4.7 | org.ops4j.pax.jdbc.pool.common/1.5.7 (Wired to javax.transaction-api/1.2.0 which is being refreshed) 2025-09-06T00:41:04,615 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Starting bundles: 2025-09-06T00:41:04,616 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.aries.quiesce.api/1.0.0 2025-09-06T00:41:04,619 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.aries.blueprint.api/1.0.1 2025-09-06T00:41:04,620 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.objectweb.asm/9.7.1 2025-09-06T00:41:04,620 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.objectweb.asm.tree/9.7.1 2025-09-06T00:41:04,620 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.objectweb.asm.commons/9.7.1 2025-09-06T00:41:04,620 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.aries.proxy/1.1.14 2025-09-06T00:41:04,624 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.aries.blueprint.core/1.10.3 2025-09-06T00:41:04,759 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.apache.aries.blueprint.core/1.10.3 has been started 2025-09-06T00:41:04,764 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.aries.blueprint.cm/1.3.2 2025-09-06T00:41:04,782 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.apache.aries.blueprint.cm/1.3.2 has been started 2025-09-06T00:41:04,782 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.objectweb.asm.tree.analysis/9.7.1 2025-09-06T00:41:04,783 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.objectweb.asm.util/9.7.1 2025-09-06T00:41:04,784 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.deployer.kar/4.4.7 2025-09-06T00:41:04,787 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.deployer.blueprint/4.4.7 2025-09-06T00:41:04,792 | INFO | fileinstall-/tmp/karaf-0.23.0/etc | fileinstall | 6 - org.apache.felix.fileinstall - 3.7.4 | Creating configuration {org.opendaylight.restconf.nb.rfc8040} from /tmp/karaf-0.23.0/etc/org.opendaylight.restconf.nb.rfc8040.cfg 2025-09-06T00:41:04,795 | INFO | fileinstall-/tmp/karaf-0.23.0/etc | fileinstall | 6 - org.apache.felix.fileinstall - 3.7.4 | Creating configuration {org.jolokia.osgi} from /tmp/karaf-0.23.0/etc/org.jolokia.osgi.cfg 2025-09-06T00:41:04,797 | INFO | fileinstall-/tmp/karaf-0.23.0/etc | fileinstall | 6 - org.apache.felix.fileinstall - 3.7.4 | Creating configuration {org.opendaylight.controller.cluster.datastore} from /tmp/karaf-0.23.0/etc/org.opendaylight.controller.cluster.datastore.cfg 2025-09-06T00:41:04,798 | INFO | fileinstall-/tmp/karaf-0.23.0/etc | fileinstall | 6 - org.apache.felix.fileinstall - 3.7.4 | Creating configuration {org.opendaylight.openflowplugin} from /tmp/karaf-0.23.0/etc/org.opendaylight.openflowplugin.cfg 2025-09-06T00:41:04,800 | 
INFO | fileinstall-/tmp/karaf-0.23.0/etc | fileinstall | 6 - org.apache.felix.fileinstall - 3.7.4 | Creating configuration {org.opendaylight.aaa.filterchain} from /tmp/karaf-0.23.0/etc/org.opendaylight.aaa.filterchain.cfg 2025-09-06T00:41:04,801 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.deployer.wrap/4.4.7 2025-09-06T00:41:04,805 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.deployer.features/4.4.7 2025-09-06T00:41:04,829 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.eclipse.jetty.util/9.4.57.v20241219 2025-09-06T00:41:04,829 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.eclipse.jetty.jmx/9.4.57.v20241219 2025-09-06T00:41:04,830 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.eclipse.jetty.io/9.4.57.v20241219 2025-09-06T00:41:04,830 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.eclipse.jetty.http/9.4.57.v20241219 2025-09-06T00:41:04,830 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.sshd.osgi/2.14.0 2025-09-06T00:41:04,831 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.sshd.scp/2.14.0 2025-09-06T00:41:04,831 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.sshd.sftp/2.14.0 2025-09-06T00:41:04,832 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.jline/3.21.0 2025-09-06T00:41:04,832 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.shell.core/4.4.7 2025-09-06T00:41:04,857 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.deployer.kar/4.4.7 2025-09-06T00:41:04,859 | INFO | features-3-thread-1 | Activator | 120 - org.apache.karaf.shell.core - 4.4.7 | Not starting local console. 
To activate set karaf.startLocalConsole=true 2025-09-06T00:41:04,882 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.apache.karaf.shell.core/4.4.7 has been started 2025-09-06T00:41:04,884 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.diagnostic.core/4.4.7 2025-09-06T00:41:04,893 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.diagnostic.core/4.4.7 2025-09-06T00:41:04,894 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | jakarta.servlet-api/4.0.0 2025-09-06T00:41:04,894 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.ops4j.pax.web.pax-web-api/8.0.30 2025-09-06T00:41:04,895 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | javax.websocket-api/1.1.2 2025-09-06T00:41:04,895 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.ops4j.pax.web.pax-web-spi/8.0.30 2025-09-06T00:41:04,896 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.ops4j.pax.web.pax-web-extender-whiteboard/8.0.30 2025-09-06T00:41:04,897 | INFO | features-3-thread-1 | Activator | 393 - org.ops4j.pax.web.pax-web-extender-whiteboard - 8.0.30 | Starting Pax Web Whiteboard Extender 2025-09-06T00:41:04,919 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.aries.jmx.api/1.1.5 2025-09-06T00:41:04,920 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.aries.jmx.core/1.1.8 2025-09-06T00:41:04,922 | INFO | features-3-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Starting JMX OSGi agent 2025-09-06T00:41:04,930 | INFO | features-3-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering MBean with ObjectName [osgi.core:service=permissionadmin,version=1.2,framework=org.eclipse.osgi,uuid=5c8e7801-1ebc-4977-9c18-d37b25e53f9f] for service with service.id [15] 2025-09-06T00:41:04,931 | INFO | features-3-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering MBean with ObjectName [osgi.compendium:service=cm,version=1.3,framework=org.eclipse.osgi,uuid=5c8e7801-1ebc-4977-9c18-d37b25e53f9f] for service with service.id [39] 2025-09-06T00:41:04,933 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.eclipse.jetty.client/9.4.57.v20241219 2025-09-06T00:41:05,037 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.osgi.service.component/1.5.1.202212101352 2025-09-06T00:41:05,037 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.eclipse.jetty.continuation/9.4.57.v20241219 2025-09-06T00:41:05,038 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.features.command/4.4.7 2025-09-06T00:41:05,050 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.features.command/4.4.7 2025-09-06T00:41:05,051 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | 
org.apache.karaf.kar.core/4.4.7 2025-09-06T00:41:05,057 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Command registration delayed for bundle org.apache.karaf.kar.core/4.4.7. Missing service: [org.apache.karaf.kar.KarService] 2025-09-06T00:41:05,057 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.jaas.config/4.4.7 2025-09-06T00:41:05,060 | INFO | activator-1-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.kar.core/4.4.7 2025-09-06T00:41:05,061 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.config.command/4.4.7 2025-09-06T00:41:05,068 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.config.command/4.4.7 2025-09-06T00:41:05,114 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.eclipse.jetty.server/9.4.57.v20241219 2025-09-06T00:41:05,116 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.eclipse.jetty.security/9.4.57.v20241219 2025-09-06T00:41:05,116 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.ops4j.pax.url.war/2.6.16 2025-09-06T00:41:05,120 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | javax.el-api/3.0.3 2025-09-06T00:41:05,121 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.eclipse.jdt.core.compiler.batch/3.26.0.v20210609-0549 2025-09-06T00:41:05,122 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.ops4j.pax.web.pax-web-jsp/8.0.30 2025-09-06T00:41:05,122 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.ops4j.pax.web.pax-web-tomcat-common/8.0.30 2025-09-06T00:41:05,123 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.eclipse.jetty.util.ajax/9.4.57.v20241219 2025-09-06T00:41:05,123 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.eclipse.jetty.servlet/9.4.57.v20241219 2025-09-06T00:41:05,123 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.eclipse.jetty.xml/9.4.57.v20241219 2025-09-06T00:41:05,124 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.eclipse.jetty.jaas/9.4.57.v20241219 2025-09-06T00:41:05,124 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.eclipse.jetty.servlets/9.4.57.v20241219 2025-09-06T00:41:05,124 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.ops4j.pax.web.pax-web-jetty/8.0.30 2025-09-06T00:41:05,135 | INFO | features-3-thread-1 | log | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Logging initialized @8222ms to org.eclipse.jetty.util.log.Slf4jLog 2025-09-06T00:41:05,142 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.ops4j.pax.web.pax-web-runtime/8.0.30 2025-09-06T00:41:05,152 | INFO | CM Configuration Updater (ManagedService Update: pid=[org.ops4j.pax.web]) | Activator | 
396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Scheduling Pax Web reconfiguration because configuration has changed 2025-09-06T00:41:05,153 | INFO | features-3-thread-1 | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | EventAdmin support enabled, WAB events will be posted to EventAdmin topics. 2025-09-06T00:41:05,153 | INFO | features-3-thread-1 | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Pax Web Runtime started 2025-09-06T00:41:05,154 | INFO | paxweb-config-1-thread-1 (change config) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Scheduling Pax Web reconfiguration because ServerControllerFactory has been registered 2025-09-06T00:41:05,158 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.http.core/4.4.7 2025-09-06T00:41:05,168 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Command registration delayed for bundle org.apache.karaf.http.core/4.4.7. Missing service: [org.apache.karaf.http.core.ProxyService] 2025-09-06T00:41:05,168 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.bundle.core/4.4.7 2025-09-06T00:41:05,185 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.bundle.core/4.4.7 2025-09-06T00:41:05,185 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.ops4j.pax.web.pax-web-websocket/8.0.30 2025-09-06T00:41:05,186 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.log.core/4.4.7 2025-09-06T00:41:05,196 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Command registration delayed for bundle org.apache.karaf.log.core/4.4.7. 
Missing service: [org.apache.karaf.log.core.LogService] 2025-09-06T00:41:05,196 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.management.server/4.4.7 2025-09-06T00:41:05,203 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.felix.scr/2.2.6 2025-09-06T00:41:05,202 | INFO | activator-1-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.log.core/4.4.7 2025-09-06T00:41:05,203 | INFO | activator-1-thread-1 | Activator | 113 - org.apache.karaf.management.server - 4.4.7 | Setting java.rmi.server.hostname system property to 127.0.0.1 2025-09-06T00:41:05,215 | INFO | features-3-thread-1 | ROOT | 93 - org.apache.felix.scr - 2.2.6 | bundle org.apache.felix.scr:2.2.6 (93) Starting with globalExtender setting: false 2025-09-06T00:41:05,219 | INFO | features-3-thread-1 | ROOT | 93 - org.apache.felix.scr - 2.2.6 | bundle org.apache.felix.scr:2.2.6 (93) Version = 2.2.6 2025-09-06T00:41:05,229 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.scr.management/4.4.7 2025-09-06T00:41:05,230 | INFO | paxweb-config-1-thread-1 (change controller) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Configuring server controller org.ops4j.pax.web.service.jetty.internal.JettyServerController 2025-09-06T00:41:05,233 | INFO | paxweb-config-1-thread-1 (change controller) | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Configuring JettyServerController{configuration=5ec41036-b58b-40f4-b20b-04579b573d1e,state=UNCONFIGURED} 2025-09-06T00:41:05,233 | INFO | paxweb-config-1-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Creating Jetty server instance using configuration properties. 
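At this point the log shows etc/org.jolokia.osgi.cfg having been generated and picked up by fileinstall, the Aries JMX core registering the standard OSGi MBeans, and Pax Web creating the Jetty server that is bound to 0.0.0.0:8181 a few lines further down. A quick external smoke test of that JMX-over-HTTP path is Jolokia's version endpoint. The sketch below assumes the agent is reachable under /jolokia on that same connector and protected by default admin/admin credentials; neither assumption comes from this log, so adjust both for a real deployment.

# Sketch: smoke-test the Jolokia agent that org.jolokia.osgi.cfg configures.
# Assumptions (not shown in this log): Jolokia exposed at /jolokia on the
# 8181 connector, default admin/admin credentials.
import requests

def jolokia_version(base: str = "http://localhost:8181/jolokia",
                    auth: tuple[str, str] = ("admin", "admin")) -> dict:
    """Fetch the Jolokia agent/protocol version; raises on HTTP errors."""
    r = requests.get(f"{base}/version", auth=auth, timeout=5)
    r.raise_for_status()
    return r.json()

if __name__ == "__main__":
    info = jolokia_version()
    # A typical response carries 'value' -> {'agent': ..., 'protocol': ...}
    print(info.get("value", {}))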
2025-09-06T00:41:05,259 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.aries.jmx.blueprint.api/1.2.0 2025-09-06T00:41:05,268 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.shell.table/4.4.7 2025-09-06T00:41:05,269 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.shell.commands/4.4.7 2025-09-06T00:41:05,271 | INFO | paxweb-config-1-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Processing Jetty configuration from files: [etc/jetty.xml] 2025-09-06T00:41:05,282 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.shell.commands/4.4.7 2025-09-06T00:41:05,282 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Updating commands for bundle org.apache.karaf.shell.commands/4.4.7 2025-09-06T00:41:05,283 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.service.core/4.4.7 2025-09-06T00:41:05,289 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.service.core/4.4.7 2025-09-06T00:41:05,289 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.jaas.modules/4.4.7 2025-09-06T00:41:05,294 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.shell.ssh/4.4.7 2025-09-06T00:41:05,337 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Command registration delayed for bundle org.apache.karaf.shell.ssh/4.4.7. Missing service: [org.apache.sshd.server.SshServer] 2025-09-06T00:41:05,338 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.aries.jmx.whiteboard/1.2.0 2025-09-06T00:41:05,354 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.ops4j.pax.web.pax-web-extender-war/8.0.30 2025-09-06T00:41:05,356 | INFO | features-3-thread-1 | Activator | 392 - org.ops4j.pax.web.pax-web-extender-war - 8.0.30 | Configuring WAR extender thread pool. 
Pool size = 3 2025-09-06T00:41:05,373 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.service.cm.ConfigurationAdminMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@fc3d0c2 with name osgi.compendium:service=cm,version=1.3,framework=org.eclipse.osgi,uuid=5c8e7801-1ebc-4977-9c18-d37b25e53f9f 2025-09-06T00:41:05,374 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.service.permissionadmin.PermissionAdminMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@fc3d0c2 with name osgi.core:service=permissionadmin,version=1.2,framework=org.eclipse.osgi,uuid=5c8e7801-1ebc-4977-9c18-d37b25e53f9f 2025-09-06T00:41:05,375 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.FrameworkMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@fc3d0c2 with name osgi.core:type=framework,version=1.7,framework=org.eclipse.osgi,uuid=5c8e7801-1ebc-4977-9c18-d37b25e53f9f 2025-09-06T00:41:05,375 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.BundleStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@fc3d0c2 with name osgi.core:type=bundleState,version=1.7,framework=org.eclipse.osgi,uuid=5c8e7801-1ebc-4977-9c18-d37b25e53f9f 2025-09-06T00:41:05,380 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.wiring.BundleWiringStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@fc3d0c2 with name osgi.core:type=wiringState,version=1.1,framework=org.eclipse.osgi,uuid=5c8e7801-1ebc-4977-9c18-d37b25e53f9f 2025-09-06T00:41:05,390 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.PackageStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@fc3d0c2 with name osgi.core:type=packageState,version=1.5,framework=org.eclipse.osgi,uuid=5c8e7801-1ebc-4977-9c18-d37b25e53f9f 2025-09-06T00:41:05,390 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.ServiceStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@fc3d0c2 with name osgi.core:type=serviceState,version=1.7,framework=org.eclipse.osgi,uuid=5c8e7801-1ebc-4977-9c18-d37b25e53f9f 2025-09-06T00:41:05,448 | INFO | activator-1-thread-1 | ServiceComponentRuntimeMBeanImpl | 115 - org.apache.karaf.scr.management - 4.4.7 | Activating the Apache Karaf ServiceComponentRuntime MBean 2025-09-06T00:41:05,580 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.instance.core/4.4.7 2025-09-06T00:41:05,629 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.instance.core/4.4.7 2025-09-06T00:41:05,630 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.package.core/4.4.7 2025-09-06T00:41:05,638 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.package.core/4.4.7 2025-09-06T00:41:05,638 | INFO | features-3-thread-1 | 
FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.jaas.command/4.4.7 2025-09-06T00:41:05,645 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.jaas.command/4.4.7 2025-09-06T00:41:05,646 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Updating commands for bundle org.apache.karaf.jaas.command/4.4.7 2025-09-06T00:41:05,647 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Updating commands for bundle org.apache.karaf.jaas.command/4.4.7 2025-09-06T00:41:05,647 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.web.core/4.4.7 2025-09-06T00:41:05,656 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Command registration delayed for bundle org.apache.karaf.web.core/4.4.7. Missing service: [org.apache.karaf.web.WebContainerService] 2025-09-06T00:41:05,657 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.aries.jmx.blueprint.core/1.2.0 2025-09-06T00:41:05,660 | INFO | activator-1-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.shell.ssh/4.4.7 2025-09-06T00:41:05,668 | INFO | paxweb-config-1-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Found configured connector "jetty-default": 0.0.0.0:8181 2025-09-06T00:41:05,670 | INFO | paxweb-config-1-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Using configured jetty-default@2e57b1d0{HTTP/1.1, (http/1.1)}{0.0.0.0:8181} as non secure connector for address: 0.0.0.0:8181 2025-09-06T00:41:05,672 | INFO | paxweb-config-1-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Eagerly starting Jetty thread pool QueuedThreadPool[qtp674804643]@2838b3a3{STOPPED,0<=0<=200,i=0,r=-1,q=0}[NO_TRY] 2025-09-06T00:41:05,674 | INFO | activator-1-thread-1 | DefaultIoServiceFactoryFactory | 125 - org.apache.sshd.osgi - 2.14.0 | No detected/configured IoServiceFactoryFactory; using Nio2ServiceFactoryFactory 2025-09-06T00:41:05,681 | INFO | paxweb-config-1-thread-1 (change controller) | JettyFactory | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Adding JMX support to Jetty server 2025-09-06T00:41:05,699 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.scr.state/4.4.7 2025-09-06T00:41:05,705 | INFO | paxweb-config-1-thread-1 (change controller) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Starting server controller org.ops4j.pax.web.service.jetty.internal.JettyServerController 2025-09-06T00:41:05,706 | INFO | paxweb-config-1-thread-1 (change controller) | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Starting JettyServerController{configuration=5ec41036-b58b-40f4-b20b-04579b573d1e,state=STOPPED} 2025-09-06T00:41:05,706 | INFO | paxweb-config-1-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Starting Server@14f2f82d{STOPPED}[9.4.57.v20241219] 2025-09-06T00:41:05,707 | INFO | paxweb-config-1-thread-1 (change controller) | Server | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | jetty-9.4.57.v20241219; 
built: 2025-01-08T21:24:30.412Z; git: df524e6b29271c2e09ba9aea83c18dc9db464a31; jvm 21.0.5+11-Ubuntu-1ubuntu122.04
2025-09-06T00:41:05,711 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.system.core/4.4.7
2025-09-06T00:41:05,721 | INFO | paxweb-config-1-thread-1 (change controller) | session | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | DefaultSessionIdManager workerName=node0
2025-09-06T00:41:05,721 | INFO | paxweb-config-1-thread-1 (change controller) | session | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | No SessionScavenger set, using defaults
2025-09-06T00:41:05,723 | INFO | paxweb-config-1-thread-1 (change controller) | session | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | node0 Scavenging every 660000ms
2025-09-06T00:41:05,727 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.system.core/4.4.7
2025-09-06T00:41:05,731 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.bundle.blueprintstate/4.4.7
2025-09-06T00:41:05,742 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.controller.blueprint/11.0.0
2025-09-06T00:41:05,743 | INFO | features-3-thread-1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.0 | Starting BlueprintBundleTracker
2025-09-06T00:41:05,754 | INFO | paxweb-config-1-thread-1 (change controller) | AbstractConnector | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started jetty-default@2e57b1d0{HTTP/1.1, (http/1.1)}{0.0.0.0:8181}
2025-09-06T00:41:05,754 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.0 | Blueprint container for bundle org.apache.aries.blueprint.cm_1.3.2 [78] was successfully created
2025-09-06T00:41:05,755 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.0 | Blueprint container for bundle org.apache.karaf.shell.core_4.4.7 [120] was successfully created
2025-09-06T00:41:05,755 | INFO | paxweb-config-1-thread-1 (change controller) | Server | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started @8844ms
2025-09-06T00:41:05,755 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.0 | Blueprint container for bundle org.apache.aries.blueprint.core_1.10.3 [79] was successfully created
2025-09-06T00:41:05,758 | INFO | paxweb-config-1-thread-1 (change controller) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Registering HttpService factory
2025-09-06T00:41:05,764 | INFO | paxweb-config-1-thread-1 (change controller) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Binding HTTP Service for bundle: [org.apache.karaf.http.core_4.4.7 [105]]
2025-09-06T00:41:05,771 | INFO | HttpService->Whiteboard (add HttpService) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Binding HTTP Service for bundle: [org.ops4j.pax.web.pax-web-extender-whiteboard_8.0.30 [393]]
2025-09-06T00:41:05,783 | INFO | activator-1-thread-2 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.http.core/4.4.7
2025-09-06T00:41:05,788 | INFO | paxweb-config-1-thread-1 (change controller) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Binding HTTP Service for bundle: [org.apache.karaf.web.core_4.4.7 [124]]
2025-09-06T00:41:05,788 | INFO | HttpService->WarExtender (add HttpService) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Binding HTTP Service for bundle: [org.ops4j.pax.web.pax-web-extender-war_8.0.30 [392]]
2025-09-06T00:41:05,796 | INFO | paxweb-config-1-thread-1 (change controller) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Registering HttpServiceRuntime
2025-09-06T00:41:05,804 | INFO | paxweb-config-1-thread-1 | ServerModel | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.30 | Created new ServletContextModel{id=ServletContextModel-3,contextPath='/'}
2025-09-06T00:41:05,805 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-1,name='default',path='/',bundle=org.ops4j.pax.web.pax-web-extender-whiteboard,context=(supplier)}", size=2}
2025-09-06T00:41:05,805 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-3,contextPath='/'}
2025-09-06T00:41:05,812 | INFO | activator-1-thread-2 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.web.core/4.4.7
2025-09-06T00:41:05,840 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Adding OsgiContextModel{WB,id=OCM-1,name='default',path='/',bundle=org.ops4j.pax.web.pax-web-extender-whiteboard,context=(supplier)} to o.o.p.w.s.j.i.PaxWebServletContextHandler@3e24bac0{/,null,STOPPED}
2025-09-06T00:41:05,844 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@3e24bac0{/,null,STOPPED}
2025-09-06T00:41:05,996 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | com.google.guava.failureaccess/1.0.3
2025-09-06T00:41:05,999 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | jakarta.annotation-api/1.3.5
2025-09-06T00:41:06,000 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | com.google.guava/33.4.8.jre
2025-09-06T00:41:06,002 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.model.ietf-type-util/14.0.13
2025-09-06T00:41:06,012 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.concepts/14.0.14
2025-09-06T00:41:06,015 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-common/14.0.14
2025-09-06T00:41:06,016 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.binding-spec/14.0.14
2025-09-06T00:41:06,019 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.binding-reflect/14.0.14
2025-09-06T00:41:06,020 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc6991-ietf-yang-types/14.0.13
2025-09-06T00:41:06,021
| INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc8341/14.0.13 2025-09-06T00:41:06,021 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc9640/14.0.13 2025-09-06T00:41:06,022 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc9642/14.0.13 2025-09-06T00:41:06,023 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.iana.tls-cipher-suite-algs/14.0.13 2025-09-06T00:41:06,023 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc9645-ietf-tls-common/14.0.13 2025-09-06T00:41:06,024 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc9641/14.0.13 2025-09-06T00:41:06,025 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc9645-ietf-tls-client/14.0.13 2025-09-06T00:41:06,025 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.aries.util/1.1.3 2025-09-06T00:41:06,026 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.iana.ssh-mac-algs/14.0.13 2025-09-06T00:41:06,026 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | io.netty.resolver/4.2.2.Final 2025-09-06T00:41:06,027 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | io.netty.transport/4.2.2.Final 2025-09-06T00:41:06,028 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | io.netty.codec-base/4.2.2.Final 2025-09-06T00:41:06,028 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | lz4-java/1.8.0 2025-09-06T00:41:06,029 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | io.netty.codec-compression/4.2.2.Final 2025-09-06T00:41:06,029 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | io.netty.transport-native-unix-common/4.2.2.Final 2025-09-06T00:41:06,031 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | io.netty.handler/4.2.2.Final 2025-09-06T00:41:06,032 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | io.netty.codec-http/4.2.2.Final 2025-09-06T00:41:06,033 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | io.netty.codec-http2/4.2.2.Final 2025-09-06T00:41:06,033 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.commons.commons-codec/1.15.0 2025-09-06T00:41:06,034 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | io.netty.transport-classes-epoll/4.2.2.Final 2025-09-06T00:41:06,035 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.transport-api/9.0.0 2025-09-06T00:41:06,035 | INFO 
| features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc6991-ietf-inet-types/14.0.13 2025-09-06T00:41:06,205 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc9643-ietf-tcp-common/14.0.13 2025-09-06T00:41:06,207 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc9643-ietf-tcp-client/14.0.13 2025-09-06T00:41:06,208 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc9643-ietf-tcp-server/14.0.13 2025-09-06T00:41:06,209 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.transport-tcp/9.0.0 2025-09-06T00:41:06,213 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc9645-ietf-tls-server/14.0.13 2025-09-06T00:41:06,213 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.transport-tls/9.0.0 2025-09-06T00:41:06,214 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.iana.crypt-hash/14.0.13 2025-09-06T00:41:06,214 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.transport-http/9.0.0 2025-09-06T00:41:06,215 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc6241/14.0.13 2025-09-06T00:41:06,215 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.shaded-sshd/9.0.0 2025-09-06T00:41:06,216 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-xpath-api/14.0.14 2025-09-06T00:41:06,217 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-model-api/14.0.14 2025-09-06T00:41:06,217 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.rfc7952-model-api/14.0.14 2025-09-06T00:41:06,218 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | triemap/1.3.2 2025-09-06T00:41:06,219 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.util/14.0.14 2025-09-06T00:41:06,219 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-ir/14.0.14 2025-09-06T00:41:06,220 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-model-spi/14.0.14 2025-09-06T00:41:06,221 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | javax.servlet-api/3.1.0 2025-09-06T00:41:06,221 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.commons.collections/3.2.2 2025-09-06T00:41:06,222 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.7 | org.apache.commons.commons-beanutils/1.11.0 2025-09-06T00:41:06,222 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.owasp.encoder/1.3.1 2025-09-06T00:41:06,223 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.aaa.repackaged-shiro/0.21.0 2025-09-06T00:41:06,224 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | io.dropwizard.metrics.core/4.2.32 2025-09-06T00:41:06,225 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | com.google.gson/2.13.1 2025-09-06T00:41:06,226 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | com.typesafe.config/1.4.3 2025-09-06T00:41:06,226 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | wrap_file__tmp_karaf-0.23.0_system_net_java_dev_stax-utils_stax-utils_20070216_stax-utils-20070216.jar/0.0.0 2025-09-06T00:41:06,227 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.commons.lang3/3.17.0 2025-09-06T00:41:06,227 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.commons.text/1.13.0 2025-09-06T00:41:06,228 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.scala-lang.scala-library/2.13.16.v20250107-233423-VFINAL-3f6bdae 2025-09-06T00:41:06,229 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | com.typesafe.sslconfig/0.6.1 2025-09-06T00:41:06,229 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.agrona.core/1.15.2 2025-09-06T00:41:06,230 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | io.aeron.client/1.38.1 2025-09-06T00:41:06,230 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | io.aeron.driver/1.38.1 2025-09-06T00:41:06,231 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | wrap_file__tmp_karaf-0.23.0_system_org_lmdbjava_lmdbjava_0.7.0_lmdbjava-0.7.0.jar/0.0.0 2025-09-06T00:41:06,231 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | reactive-streams/1.0.4 2025-09-06T00:41:06,231 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.controller.repackaged-pekko/11.0.0 2025-09-06T00:41:06,234 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | checker-qual/3.49.3 2025-09-06T00:41:06,235 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | io.dropwizard.metrics.jmx/4.2.32 2025-09-06T00:41:06,236 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.geronimo.specs.geronimo-atinject_1.0_spec/1.2.0 2025-09-06T00:41:06,236 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.controller.raft-api/11.0.0 2025-09-06T00:41:06,237 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.controller.raft-spi/11.0.0 2025-09-06T00:41:06,237 | INFO | 
features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-data-api/14.0.14 2025-09-06T00:41:06,238 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-data-spi/14.0.14 2025-09-06T00:41:06,238 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.rfc8528-model-api/14.0.14 2025-09-06T00:41:06,238 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.rfc8040-model-api/14.0.14 2025-09-06T00:41:06,239 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-model-util/14.0.14 2025-09-06T00:41:06,239 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-data-util/14.0.14 2025-09-06T00:41:06,240 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-data-impl/14.0.14 2025-09-06T00:41:06,240 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-data-tree-api/14.0.14 2025-09-06T00:41:06,241 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-data-tree-spi/14.0.14 2025-09-06T00:41:06,241 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-data-codec-binfmt/14.0.14 2025-09-06T00:41:06,241 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-repo-api/14.0.14 2025-09-06T00:41:06,242 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-repo-spi/14.0.14 2025-09-06T00:41:06,242 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.mdsal-common-api/14.0.13 2025-09-06T00:41:06,243 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.mdsal-dom-api/14.0.13 2025-09-06T00:41:06,243 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.netconf-common-mdsal/9.0.0 2025-09-06T00:41:06,244 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-data-codec-gson/14.0.14 2025-09-06T00:41:06,244 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | stax2-api/4.2.2 2025-09-06T00:41:06,245 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-data-codec-xml/14.0.14 2025-09-06T00:41:06,245 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.databind/9.0.0 2025-09-06T00:41:06,245 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.netconf-api/9.0.0 2025-09-06T00:41:06,246 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | 
org.opendaylight.netconf.dom-api/9.0.0 2025-09-06T00:41:06,246 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc6243/14.0.13 2025-09-06T00:41:06,247 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.restconf-api/9.0.0 2025-09-06T00:41:06,247 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.odlext-model-api/14.0.14 2025-09-06T00:41:06,247 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.mdsal-dom-spi/14.0.13 2025-09-06T00:41:06,248 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc8040-ietf-restconf/14.0.13 2025-09-06T00:41:06,248 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc8072/14.0.13 2025-09-06T00:41:06,249 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.restconf-server-api/9.0.0 2025-09-06T00:41:06,249 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc8343/14.0.13 2025-09-06T00:41:06,250 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc8344/14.0.13 2025-09-06T00:41:06,250 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc8528/14.0.13 2025-09-06T00:41:06,251 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc8529/14.0.13 2025-09-06T00:41:06,252 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc8639/14.0.13 2025-09-06T00:41:06,252 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-model-export/14.0.14 2025-09-06T00:41:06,253 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.restconf-server-spi/9.0.0 2025-09-06T00:41:06,253 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.restconf-mdsal-spi/9.0.0 2025-09-06T00:41:06,254 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc8040-ietf-restconf-monitoring/14.0.13 2025-09-06T00:41:06,254 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc8650/14.0.13 2025-09-06T00:41:06,255 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.binding-model/14.0.14 2025-09-06T00:41:06,255 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.binding-runtime-api/14.0.14 2025-09-06T00:41:06,255 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-parser-api/14.0.14 2025-09-06T00:41:06,256 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-parser-spi/14.0.14 2025-09-06T00:41:06,256 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.odlext-parser-support/14.0.14 2025-09-06T00:41:06,257 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.openconfig-model-api/14.0.14 2025-09-06T00:41:06,257 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.openconfig-parser-support/14.0.14 2025-09-06T00:41:06,257 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.rfc6241-model-api/14.0.14 2025-09-06T00:41:06,258 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.rfc6241-parser-support/14.0.14 2025-09-06T00:41:06,258 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.rfc6536-model-api/14.0.14 2025-09-06T00:41:06,259 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.rfc6536-parser-support/14.0.14 2025-09-06T00:41:06,259 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.rfc6643-model-api/14.0.14 2025-09-06T00:41:06,259 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.rfc6643-parser-support/14.0.14 2025-09-06T00:41:06,260 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-model-ri/14.0.14 2025-09-06T00:41:06,260 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.rfc7952-parser-support/14.0.14 2025-09-06T00:41:06,261 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.rfc8040-parser-support/14.0.14 2025-09-06T00:41:06,261 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.rfc8528-parser-support/14.0.14 2025-09-06T00:41:06,262 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.rfc8639-model-api/14.0.14 2025-09-06T00:41:06,262 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.rfc8639-parser-support/14.0.14 2025-09-06T00:41:06,262 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.rfc8819-model-api/14.0.14 2025-09-06T00:41:06,263 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.rfc8819-parser-support/14.0.14 2025-09-06T00:41:06,263 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.antlr.antlr4-runtime/4.13.2 2025-09-06T00:41:06,264 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-parser-reactor/14.0.14 2025-09-06T00:41:06,264 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-parser-rfc7950/14.0.14 2025-09-06T00:41:06,265 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-xpath-impl/14.0.14 2025-09-06T00:41:06,269 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-parser-impl/14.0.14 2025-09-06T00:41:06,274 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.binding-runtime-spi/14.0.14 2025-09-06T00:41:06,276 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.binding-generator/14.0.14 2025-09-06T00:41:06,283 | INFO | features-3-thread-1 | DefaultBindingRuntimeGenerator | 328 - org.opendaylight.yangtools.binding-generator - 14.0.14 | Binding/YANG type support activated 2025-09-06T00:41:06,283 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.binding-runtime-osgi/14.0.14 2025-09-06T00:41:06,294 | INFO | features-3-thread-1 | OSGiBindingRuntime | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.14 | Binding Runtime activating 2025-09-06T00:41:06,296 | INFO | features-3-thread-1 | OSGiBindingRuntime | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.14 | Binding Runtime activated 2025-09-06T00:41:06,300 | INFO | features-3-thread-1 | OSGiModelRuntime | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.14 | Model Runtime starting 2025-09-06T00:41:06,339 | INFO | features-3-thread-1 | KarafFeaturesSupport | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.14 | Will attempt to integrate with Karaf FeaturesService 2025-09-06T00:41:07,038 | INFO | features-3-thread-1 | NettyTransportSupport | 284 - org.opendaylight.netconf.transport-api - 9.0.0 | Netty transport backed by epoll(2) 2025-09-06T00:41:07,329 | INFO | features-3-thread-1 | SharedEffectiveModelContextFactory | 379 - org.opendaylight.yangtools.yang-parser-impl - 14.0.14 | Using weak references 2025-09-06T00:41:09,603 | INFO | features-3-thread-1 | OSGiModuleInfoSnapshotImpl | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.14 | EffectiveModelContext generation 1 activated 2025-09-06T00:41:10,345 | INFO | features-3-thread-1 | OSGiBindingRuntimeContextImpl | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.14 | BindingRuntimeContext generation 1 activated 2025-09-06T00:41:10,345 | INFO | features-3-thread-1 | GlobalBindingRuntimeContext | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.14 | Global BindingRuntimeContext generation 1 activated 2025-09-06T00:41:10,346 | INFO | features-3-thread-1 | OSGiModelRuntime | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.14 | Model Runtime started 2025-09-06T00:41:10,346 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.mdsal-dom-schema-osgi/14.0.13 2025-09-06T00:41:10,353 | INFO | features-3-thread-1 | OSGiDOMSchemaService | 251 - org.opendaylight.mdsal.mdsal-dom-schema-osgi - 14.0.13 | DOM Schema services activated 2025-09-06T00:41:10,354 | INFO | features-3-thread-1 | 
OSGiDOMSchemaService | 251 - org.opendaylight.mdsal.mdsal-dom-schema-osgi - 14.0.13 | Updating context to generation 1 2025-09-06T00:41:10,354 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.mdsal-dom-broker/14.0.13 2025-09-06T00:41:10,366 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.model.yang-ext/2013.9.7.26_13 2025-09-06T00:41:10,368 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.model.inventory/0.20.0 2025-09-06T00:41:10,368 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.model.opendaylight-l2-types/2013.8.27.26_13 2025-09-06T00:41:10,369 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.model.flow-base/0.20.0 2025-09-06T00:41:10,369 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.controller.sal-common-util/11.0.0 2025-09-06T00:41:10,370 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.controller.sal-remoterpc-connector/11.0.0 2025-09-06T00:41:10,372 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.infrautils.ready-api/7.1.4 2025-09-06T00:41:10,372 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.codegen-extensions/14.0.14 2025-09-06T00:41:10,373 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.model.flow-service/0.20.0 2025-09-06T00:41:10,373 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.model.flow-statistics/0.20.0 2025-09-06T00:41:10,374 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.glassfish.jersey.core.jersey-client/2.47.0 2025-09-06T00:41:10,375 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | jakarta.validation.jakarta.validation-api/2.0.2 2025-09-06T00:41:10,375 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.glassfish.jersey.core.jersey-server/2.47.0 2025-09-06T00:41:10,376 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.glassfish.jersey.media.jersey-media-sse/2.47.0 2025-09-06T00:41:10,377 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.mdsal-binding-api/14.0.13 2025-09-06T00:41:10,378 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.mdsal-eos-common-api/14.0.13 2025-09-06T00:41:10,378 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.model.general-entity/14.0.13 2025-09-06T00:41:10,379 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.mdsal-eos-binding-api/14.0.13 2025-09-06T00:41:10,380 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.mdsal-singleton-api/14.0.13
2025-09-06T00:41:10,381 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.api/0.20.0
2025-09-06T00:41:10,381 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.infrautils.diagstatus-api/7.1.4
2025-09-06T00:41:10,382 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.infrautils.util/7.1.4
2025-09-06T00:41:10,383 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.mdsal-binding-spi/14.0.13
2025-09-06T00:41:10,388 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.openflowjava.openflow-protocol-spi/0.20.0
2025-09-06T00:41:10,389 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.openflowjava.util/0.20.0
2025-09-06T00:41:10,389 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-common-netty/14.0.14
2025-09-06T00:41:10,389 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.odlparent.bundles-diag/14.1.0
2025-09-06T00:41:10,392 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.infrautils.ready-impl/7.1.4
2025-09-06T00:41:10,405 | INFO | features-3-thread-1 | KarafSystemReady | 202 - org.opendaylight.infrautils.ready-impl - 7.1.4 | ThreadFactory created: SystemReadyService
2025-09-06T00:41:10,406 | INFO | features-3-thread-1 | KarafSystemReady | 202 - org.opendaylight.infrautils.ready-impl - 7.1.4 | Now starting to provide full system readiness status updates (see TestBundleDiag's logs)...
2025-09-06T00:41:10,407 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.infrautils.diagstatus-impl/7.1.4
2025-09-06T00:41:10,407 | INFO | SystemReadyService-0 | KarafSystemReady | 202 - org.opendaylight.infrautils.ready-impl - 7.1.4 | checkBundleDiagInfos() started...
2025-09-06T00:41:10,415 | INFO | features-3-thread-1 | DiagStatusServiceImpl | 199 - org.opendaylight.infrautils.diagstatus-impl - 7.1.4 | Diagnostic Status Service started
2025-09-06T00:41:10,419 | INFO | features-3-thread-1 | MBeanUtils | 198 - org.opendaylight.infrautils.diagstatus-api - 7.1.4 | MBean registration for org.opendaylight.infrautils.diagstatus:type=SvcStatus SUCCESSFUL.
2025-09-06T00:41:10,419 | INFO | features-3-thread-1 | DiagStatusServiceMBeanImpl | 199 - org.opendaylight.infrautils.diagstatus-impl - 7.1.4 | Diagnostic Status Service management started
2025-09-06T00:41:10,419 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl/0.20.0
2025-09-06T00:41:10,423 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.common/0.20.0
2025-09-06T00:41:10,428 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.extension-api/0.20.0
2025-09-06T00:41:10,428 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.binding-data-codec-api/14.0.14
2025-09-06T00:41:10,429 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.binding-data-codec-spi/14.0.14
2025-09-06T00:41:10,429 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | net.bytebuddy.byte-buddy/1.17.5
2025-09-06T00:41:10,431 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.binding-loader/14.0.14
2025-09-06T00:41:10,431 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.binding-data-codec-dynamic/14.0.14
2025-09-06T00:41:10,435 | INFO | features-3-thread-1 | SimpleBindingDOMCodecFactory | 325 - org.opendaylight.yangtools.binding-data-codec-dynamic - 14.0.14 | Binding/DOM Codec enabled
2025-09-06T00:41:10,435 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.binding-data-codec-osgi/14.0.14
2025-09-06T00:41:10,440 | INFO | features-3-thread-1 | OSGiBindingDOMCodec | 326 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.14 | Binding/DOM Codec activating
2025-09-06T00:41:10,460 | INFO | features-3-thread-1 | OSGiBindingDOMCodecServicesImpl | 326 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.14 | Binding/DOM Codec generation 1 activated
2025-09-06T00:41:10,460 | INFO | features-3-thread-1 | GlobalBindingDOMCodecServices | 326 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.14 | Global Binding/DOM Codec activated with generation 1
2025-09-06T00:41:10,462 | INFO | features-3-thread-1 | OSGiBindingDOMCodec | 326 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.14 | Binding/DOM Codec activated
2025-09-06T00:41:10,463 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding-dom-adapter/14.0.13
2025-09-06T00:41:10,475 | INFO | features-3-thread-1 | OSGiBlockingBindingNormalizer | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.13 | Binding/DOM adapter activated
2025-09-06T00:41:10,485 | INFO | features-3-thread-1 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.13 | Binding/DOM adapter for MountPointService activated
2025-09-06T00:41:10,490 | INFO | features-3-thread-1 | DOMNotificationRouter | 250 - org.opendaylight.mdsal.mdsal-dom-broker - 14.0.13 | DOM Notification Router started
2025-09-06T00:41:10,493 | INFO | features-3-thread-1 | AbstractAdaptedService | 204 -
org.opendaylight.mdsal.binding-dom-adapter - 14.0.13 | Binding/DOM adapter for NotificationService activated 2025-09-06T00:41:10,495 | INFO | features-3-thread-1 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.13 | Binding/DOM adapter for NotificationPublishService activated 2025-09-06T00:41:10,500 | INFO | features-3-thread-1 | DOMRpcRouter | 250 - org.opendaylight.mdsal.mdsal-dom-broker - 14.0.13 | DOM RPC/Action router started 2025-09-06T00:41:10,502 | INFO | features-3-thread-1 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.13 | Binding/DOM adapter for RpcService activated 2025-09-06T00:41:10,505 | INFO | features-3-thread-1 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.13 | Binding/DOM adapter for RpcProviderService activated 2025-09-06T00:41:10,508 | INFO | features-3-thread-1 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.13 | Binding/DOM adapter for ActionService activated 2025-09-06T00:41:10,510 | INFO | features-3-thread-1 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.13 | Binding/DOM adapter for ActionProviderService activated 2025-09-06T00:41:10,510 | INFO | features-3-thread-1 | DynamicBindingAdapter | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.13 | 8 DOMService trackers started 2025-09-06T00:41:10,511 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.mdsal-eos-dom-api/14.0.13 2025-09-06T00:41:10,512 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.controller.eos-dom-akka/11.0.0 2025-09-06T00:41:10,515 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.eos-binding-adapter/14.0.13 2025-09-06T00:41:10,516 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.mdsal-singleton-impl/14.0.13 2025-09-06T00:41:10,517 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.impl/0.20.0 2025-09-06T00:41:10,556 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.impl/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationServiceFactory), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))] 2025-09-06T00:41:10,559 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Command registration delayed for bundle org.opendaylight.openflowplugin.impl/0.20.0. 
Missing service: [org.opendaylight.openflowplugin.api.openflow.statistics.ofpspecific.MessageIntelligenceAgency] 2025-09-06T00:41:10,566 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.impl/0.20.0 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))] 2025-09-06T00:41:10,569 | INFO | features-3-thread-1 | MessageIntelligenceAgencyImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Registered MBean org.opendaylight.openflowplugin.impl.statistics.ofpspecific:type=MessageIntelligenceAgencyMXBean 2025-09-06T00:41:10,570 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.opendaylight.openflowplugin.impl/0.20.0 2025-09-06T00:41:10,570 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.applications.reconciliation-framework/0.20.0 2025-09-06T00:41:10,572 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Command registration delayed for bundle org.opendaylight.openflowplugin.applications.reconciliation-framework/0.20.0. Missing service: [org.opendaylight.openflowplugin.applications.reconciliation.ReconciliationManager] 2025-09-06T00:41:10,575 | INFO | features-3-thread-1 | ReconciliationManagerImpl | 302 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.20.0 | ReconciliationManager started 2025-09-06T00:41:10,575 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.opendaylight.openflowplugin.applications.reconciliation-framework/0.20.0 2025-09-06T00:41:10,576 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.applications.arbitratorreconciliation-api/0.20.0 2025-09-06T00:41:10,576 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.applications.arbitratorreconciliation-impl/0.20.0 2025-09-06T00:41:10,578 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.glassfish.hk2.external.aopalliance-repackaged/2.6.1 2025-09-06T00:41:10,582 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.glassfish.hk2.osgi-resource-locator/1.0.3 2025-09-06T00:41:10,625 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | com.h2database/2.3.232 2025-09-06T00:41:10,639 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.iana.ssh-public-key-algs/14.0.13 2025-09-06T00:41:10,639 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.iana.ssh-encryption-algs/14.0.13 2025-09-06T00:41:10,640 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.iana.ssh-key-exchange-algs/14.0.13 2025-09-06T00:41:10,641 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc9644-ietf-ssh-common/14.0.13 2025-09-06T00:41:10,642 | INFO | 
features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.aaa.shiro-api/0.21.0 2025-09-06T00:41:10,643 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | karaf.branding/14.1.0 2025-09-06T00:41:10,643 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.aaa.encrypt-service/0.21.0 2025-09-06T00:41:10,643 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.aaa.cert/0.21.0 2025-09-06T00:41:10,648 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.0 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))] 2025-09-06T00:41:10,649 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.scala-lang.scala-reflect/2.13.16.v20250107-233423-VFINAL-3f6bdae 2025-09-06T00:41:10,649 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.apache.karaf.jdbc.core/4.4.7 2025-09-06T00:41:10,660 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.jdbc.core/4.4.7 2025-09-06T00:41:10,660 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.glassfish.jersey.containers.jersey-container-servlet-core/2.47.0 2025-09-06T00:41:10,661 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.srm-api/0.20.0 2025-09-06T00:41:10,662 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.extension-onf/0.20.0 2025-09-06T00:41:10,664 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.aaa.web.api/0.21.0 2025-09-06T00:41:10,668 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.aaa.web.osgi-impl/0.21.0 2025-09-06T00:41:10,670 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.applications.device-ownership-service/0.20.0 2025-09-06T00:41:10,671 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.libraries.liblldp/0.20.0 2025-09-06T00:41:10,674 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.model.ietf-topology/2013.10.21.26_13 2025-09-06T00:41:10,675 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.0 2025-09-06T00:41:10,681 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.0 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), 
(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.eos.binding.api.EntityOwnershipService)] 2025-09-06T00:41:10,682 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.0 2025-09-06T00:41:10,686 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.openflowplugin.applications.deviceownershipservice.DeviceOwnershipService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))] 2025-09-06T00:41:10,686 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.jolokia.osgi/1.7.2 2025-09-06T00:41:10,689 | INFO | features-3-thread-1 | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Binding HTTP Service for bundle: [org.jolokia.osgi_1.7.2 [155]] 2025-09-06T00:41:10,704 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Registering ServletModel{id=ServletModel-4,name='org.jolokia.osgi.servlet.JolokiaServlet',alias='/jolokia',urlPatterns=[/jolokia/*],servlet=org.jolokia.osgi.servlet.JolokiaServlet@7a51606e,contexts=[{HS,OCM-5,context:1354494756,/}]} 2025-09-06T00:41:10,705 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of ServletModel{id=ServletModel-4,name='org.jolokia.osgi.servlet.JolokiaServlet',alias='/jolokia',urlPatterns=[/jolokia/*],servlet=org.jolokia.osgi.servlet.JolokiaServlet@7a51606e,contexts=null}", size=3} 2025-09-06T00:41:10,706 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Adding OsgiContextModel{HS,id=OCM-5,name='context:1354494756',path='/',bundle=org.jolokia.osgi,context=WebContainerContextWrapper{bundle=org.jolokia.osgi_1.7.2 [155],contextId='context:1354494756',delegate=org.jolokia.osgi.security.ServiceAuthenticationHttpContext@50bbf324}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@3e24bac0{/,null,STOPPED} 2025-09-06T00:41:10,707 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@3e24bac0{/,null,STOPPED} 2025-09-06T00:41:10,707 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Adding servlet ServletModel{id=ServletModel-4,name='org.jolokia.osgi.servlet.JolokiaServlet',alias='/jolokia',urlPatterns=[/jolokia/*],servlet=org.jolokia.osgi.servlet.JolokiaServlet@7a51606e,contexts=[{HS,OCM-5,context:1354494756,/}]} 2025-09-06T00:41:10,713 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Starting Jetty context "/" with default Osgi Context OsgiContextModel{HS,id=OCM-5,name='context:1354494756',path='/',bundle=org.jolokia.osgi,context=WebContainerContextWrapper{bundle=org.jolokia.osgi_1.7.2 [155],contextId='context:1354494756',delegate=org.jolokia.osgi.security.ServiceAuthenticationHttpContext@50bbf324}} 2025-09-06T00:41:10,734 | 
INFO | paxweb-config-1-thread-1 | osgi | 155 - org.jolokia.osgi - 1.7.2 | No access restrictor found, access to any MBean is allowed 2025-09-06T00:41:10,766 | INFO | paxweb-config-1-thread-1 | ContextHandler | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@3e24bac0{/,null,AVAILABLE} 2025-09-06T00:41:10,766 | INFO | paxweb-config-1-thread-1 | OsgiServletContext | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.30 | Registering OsgiServletContext{model=OsgiContextModel{HS,id=OCM-5,name='context:1354494756',path='/',bundle=org.jolokia.osgi,context=WebContainerContextWrapper{bundle=org.jolokia.osgi_1.7.2 [155],contextId='context:1354494756',delegate=org.jolokia.osgi.security.ServiceAuthenticationHttpContext@50bbf324}}} as OSGi service for "/" context path 2025-09-06T00:41:10,771 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc8342-ietf-datastores/14.0.13 2025-09-06T00:41:10,771 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc7952/14.0.13 2025-09-06T00:41:10,772 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc8342-ietf-origin/14.0.13 2025-09-06T00:41:10,772 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc8526/14.0.13 2025-09-06T00:41:10,773 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.srm-impl/0.20.0 2025-09-06T00:41:10,778 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.0 2025-09-06T00:41:10,784 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.applications.frm.recovery.OpenflowServiceRecoveryHandler), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)] 2025-09-06T00:41:10,791 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)] 2025-09-06T00:41:10,791 | INFO | features-3-thread-1 | OpenflowServiceRecoveryHandlerImpl | 299 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.20.0 | Registering openflowplugin service recovery handlers 2025-09-06T00:41:10,792 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-repo-fs/14.0.14 2025-09-06T00:41:10,793 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 
4.4.7 | org.opendaylight.controller.cds-mgmt-api/11.0.0 2025-09-06T00:41:10,794 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.controller.raft-journal/11.0.0 2025-09-06T00:41:10,794 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc9644-ietf-ssh-client/14.0.13 2025-09-06T00:41:10,796 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc9644-ietf-ssh-server/14.0.13 2025-09-06T00:41:10,796 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc8525/14.0.13 2025-09-06T00:41:10,797 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.applications.of-switch-config-pusher/0.20.0 2025-09-06T00:41:10,798 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.controller.cds-access-api/11.0.0 2025-09-06T00:41:10,798 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc6470/14.0.13 2025-09-06T00:41:10,799 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding.model.ietf.rfc7407-ietf-x509-cert-to-name/14.0.13 2025-09-06T00:41:10,799 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.model.draft-ietf-restconf-server/9.0.0 2025-09-06T00:41:10,799 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | javassist/3.30.2.GA 2025-09-06T00:41:10,800 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.applications.bulk-o-matic/0.20.0 2025-09-06T00:41:10,801 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.aaa.authn-api/0.21.0 2025-09-06T00:41:10,802 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.aaa.password-service-api/0.21.0 2025-09-06T00:41:10,802 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.aaa.tokenauthrealm/0.21.0 2025-09-06T00:41:10,803 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.aaa.web.servlet-api/0.21.0 2025-09-06T00:41:10,803 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.aaa.password-service-impl/0.21.0 2025-09-06T00:41:10,806 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.aaa.idm-store-h2/0.21.0 2025-09-06T00:41:10,807 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.aaa.web.servlet-jersey2/0.21.0 2025-09-06T00:41:10,810 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.aaa.shiro/0.21.0 2025-09-06T00:41:10,814 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle 
org.opendaylight.aaa.shiro/0.21.0 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.api.IIDMStore), (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth)] 2025-09-06T00:41:10,840 | INFO | features-3-thread-1 | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Binding HTTP Service for bundle: [org.opendaylight.aaa.shiro_0.21.0 [172]] 2025-09-06T00:41:10,841 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Registering ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[{WB,OCM-1,default,/}]} 2025-09-06T00:41:10,841 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[{WB,OCM-1,default,/}]}", size=1} 2025-09-06T00:41:10,841 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Adding servlet ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[{WB,OCM-1,default,/}]} 2025-09-06T00:41:10,843 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | com.rabbitmq.client/5.25.0 2025-09-06T00:41:10,844 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.model.topology/0.20.0 2025-09-06T00:41:10,844 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.yanglib-mdsal-writer/9.0.0 2025-09-06T00:41:10,846 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-data-tree-ri/14.0.14 2025-09-06T00:41:10,847 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.scala-lang.modules.scala-parser-combinators/1.1.2 2025-09-06T00:41:10,848 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.model.rfc5277/9.0.0 2025-09-06T00:41:10,849 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.odl-device-notification/9.0.0 2025-09-06T00:41:10,850 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.controller.sal-cluster-admin-api/11.0.0 2025-09-06T00:41:10,851 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.restconf-server/9.0.0 2025-09-06T00:41:10,852 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.restconf-nb/9.0.0 2025-09-06T00:41:10,856 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.controller.atomix-storage/11.0.0 2025-09-06T00:41:10,856 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.7 | org.opendaylight.controller.sal-akka-segmented-journal/11.0.0 2025-09-06T00:41:10,856 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.controller.cds-dom-api/11.0.0 2025-09-06T00:41:10,857 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.aaa.jetty-auth-log-filter/0.21.0 2025-09-06T00:41:10,858 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | io.dropwizard.metrics.jvm/4.2.32 2025-09-06T00:41:10,858 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | io.dropwizard.metrics.healthchecks/4.2.32 2025-09-06T00:41:10,858 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.jspecify.jspecify/1.0.0 2025-09-06T00:41:10,859 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | io.dropwizard.metrics.graphite/4.2.32 2025-09-06T00:41:10,859 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.model.sal-remote/9.0.0 2025-09-06T00:41:10,859 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.glassfish.jersey.containers.jersey-container-servlet/2.47.0 2025-09-06T00:41:10,859 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.truststore-none/9.0.0 2025-09-06T00:41:10,860 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.controller.cds-access-client/11.0.0 2025-09-06T00:41:10,860 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.mdsal.binding-util/14.0.13 2025-09-06T00:41:10,860 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.restconf-subscription/9.0.0 2025-09-06T00:41:10,863 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.applications.topology-manager/0.20.0 2025-09-06T00:41:10,867 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.aaa.encrypt-service-impl/0.21.0 2025-09-06T00:41:10,869 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.keystore-none/9.0.0 2025-09-06T00:41:10,872 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | com.googlecode.json-simple/1.1.1 2025-09-06T00:41:10,872 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.transport-ssh/9.0.0 2025-09-06T00:41:10,872 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.yangtools.yang-data-transform/14.0.14 2025-09-06T00:41:10,873 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.openflowjava.blueprint-config/0.20.0 2025-09-06T00:41:10,875 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.srm-shell/0.20.0 2025-09-06T00:41:10,878 | INFO 
| features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Command registration delayed for bundle org.opendaylight.openflowplugin.srm-shell/0.20.0. Missing service: [org.opendaylight.serviceutils.srm.spi.RegistryControl, org.opendaylight.mdsal.binding.api.DataBroker]
2025-09-06T00:41:10,878 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.infrautils.diagstatus-shell/7.1.4
2025-09-06T00:41:10,879 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.opendaylight.infrautils.diagstatus-shell/7.1.4
2025-09-06T00:41:10,879 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.glassfish.hk2.locator/2.6.1
2025-09-06T00:41:10,881 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.glassfish.jersey.inject.jersey-hk2/2.47.0
2025-09-06T00:41:10,882 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.controller.sal-akka-raft/11.0.0
2025-09-06T00:41:10,882 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.aaa.filterchain/0.21.0
2025-09-06T00:41:10,887 | INFO | features-3-thread-1 | CustomFilterAdapterConfigurationImpl | 166 - org.opendaylight.aaa.filterchain - 0.21.0 | Custom filter properties updated: {service.pid=org.opendaylight.aaa.filterchain, osgi.ds.satisfying.condition.target=(osgi.condition.id=true), customFilterList=, component.name=org.opendaylight.aaa.filterchain.configuration.impl.CustomFilterAdapterConfigurationImpl, felix.fileinstall.filename=file:/tmp/karaf-0.23.0/etc/org.opendaylight.aaa.filterchain.cfg, component.id=110, Filter.target=(org.opendaylight.aaa.filterchain.filter=true)}
2025-09-06T00:41:10,887 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.glassfish.hk2.utils/2.6.1
2025-09-06T00:41:10,888 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.glassfish.hk2.api/2.6.1
2025-09-06T00:41:10,889 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | jakarta.ws.rs-api/2.1.6
2025-09-06T00:41:10,890 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.restconf-server-jaxrs/9.0.0
2025-09-06T00:41:10,893 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.sal-remote-impl/9.0.0
2025-09-06T00:41:10,895 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.restconf-server-mdsal/9.0.0
2025-09-06T00:41:10,898 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.controller.sal-clustering-commons/11.0.0
2025-09-06T00:41:10,901 | INFO | features-3-thread-1 | FileAkkaConfigurationReader | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | File-based Pekko configuration reader enabled
2025-09-06T00:41:10,901 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.controller.sal-distributed-datastore/11.0.0
2025-09-06T00:41:10,909 | INFO | features-3-thread-1 | OSGiActorSystemProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Actor System provider starting
2025-09-06T00:41:11,120 | INFO | features-3-thread-1 | ActorSystemProviderImpl | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Creating new ActorSystem
2025-09-06T00:41:11,409 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Slf4jLogger | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Slf4jLogger started
2025-09-06T00:41:11,684 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ArteryTransport | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Remoting started with transport [Artery tcp]; listening on address [pekko://opendaylight-cluster-data@10.30.170.226:2550] with UID [6751317067529774690]
2025-09-06T00:41:11,694 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Starting up, Pekko version [1.0.3] ...
2025-09-06T00:41:11,749 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Registered cluster JMX MBean [pekko:type=Cluster]
2025-09-06T00:41:11,750 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Started up successfully
2025-09-06T00:41:11,781 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | SBR started. Config: strategy [KeepMajority], stable-after [7 seconds], down-all-when-unstable [5250 milliseconds], selfUniqueAddress [pekko://opendaylight-cluster-data@10.30.170.226:2550#6751317067529774690], selfDc [default].
2025-09-06T00:41:12,025 | INFO | features-3-thread-1 | OSGiActorSystemProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Actor System provider started
2025-09-06T00:41:12,029 | INFO | features-3-thread-1 | OSGiRemoteOpsProvider | 197 - org.opendaylight.controller.sal-remoterpc-connector - 11.0.0 | Remote Operations service starting
2025-09-06T00:41:12,032 | INFO | features-3-thread-1 | OSGiRemoteOpsProvider | 197 - org.opendaylight.controller.sal-remoterpc-connector - 11.0.0 | Remote Operations service started
2025-09-06T00:41:12,034 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))]
2025-09-06T00:41:12,034 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.0 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)]
2025-09-06T00:41:12,116 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.195:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.195/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-09-06T00:41:12,117 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.195:2550], control stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.195/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-09-06T00:41:12,175 | INFO | features-3-thread-1 | OSGiDatastoreContextIntrospectorFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Datastore Context Introspector activated
2025-09-06T00:41:12,178 | INFO | features-3-thread-1 | FileModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Shard configuration provider started
2025-09-06T00:41:12,181 | INFO | features-3-thread-1 | OSGiDistributedDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Distributed Datastore type CONFIGURATION starting
2025-09-06T00:41:12,226 | INFO | opendaylight-cluster-data-pekko.persistence.dispatchers.default-plugin-dispatcher-33 | SegmentedFileJournal | 191 - org.opendaylight.controller.sal-akka-segmented-journal - 11.0.0 | Initialized with root directory segmented-journal with storage MAPPED
2025-09-06T00:41:12,405 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoin message from
[Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#1523469842]], but this node is not initialized yet 2025-09-06T00:41:12,662 | INFO | features-3-thread-1 | DistributedDataStoreFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Create data store instance of type : config 2025-09-06T00:41:12,694 | INFO | features-3-thread-1 | AbstractModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Config file exists - reading config from it 2025-09-06T00:41:12,695 | INFO | features-3-thread-1 | AbstractModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Config file exists - reading config from it 2025-09-06T00:41:12,700 | INFO | features-3-thread-1 | AbstractDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Creating ShardManager : shardmanager-config 2025-09-06T00:41:12,722 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Starting ShardManager shard-manager-config 2025-09-06T00:41:12,732 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Recovery complete 2025-09-06T00:41:12,845 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | RecoveringClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: saving tombstone ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=0} 2025-09-06T00:41:12,891 | INFO | features-3-thread-1 | DistributedDataStoreFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Data store config is using tell-based protocol 2025-09-06T00:41:12,895 | INFO | features-3-thread-1 | AbstractModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Config file exists - reading config from it 2025-09-06T00:41:12,896 | INFO | features-3-thread-1 | AbstractModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Config file exists - reading config from it 2025-09-06T00:41:12,897 | INFO | features-3-thread-1 | OSGiDistributedDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Distributed Datastore type OPERATIONAL starting 2025-09-06T00:41:12,898 | INFO | features-3-thread-1 | DistributedDataStoreFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Create data store instance of type : operational 2025-09-06T00:41:12,899 | INFO | features-3-thread-1 | AbstractDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Creating ShardManager : shardmanager-operational 2025-09-06T00:41:12,907 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Starting ShardManager shard-manager-operational 2025-09-06T00:41:12,912 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Recovery complete 2025-09-06T00:41:12,920 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | RecoveringClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | 
member-3-frontend-datastore-operational: saving tombstone ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=0} 2025-09-06T00:41:12,923 | INFO | features-3-thread-1 | DistributedDataStoreFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Data store operational is using tell-based protocol 2025-09-06T00:41:12,926 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-topology-config: Shard created, persistent : true 2025-09-06T00:41:12,926 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-toaster-config: Shard created, persistent : true 2025-09-06T00:41:12,929 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-default-config: Shard created, persistent : true 2025-09-06T00:41:12,926 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | jakarta.activation-api/1.2.2 2025-09-06T00:41:12,931 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Shard created, persistent : true 2025-09-06T00:41:12,937 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-default-operational: Shard created, persistent : false 2025-09-06T00:41:12,940 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.glassfish.jersey.core.jersey-common/2.47.0 2025-09-06T00:41:12,942 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-topology-operational: Shard created, persistent : false 2025-09-06T00:41:12,944 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-operational: Shard created, persistent : false 2025-09-06T00:41:12,952 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | io.netty.common/4.2.2.Final 2025-09-06T00:41:12,954 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | io.netty.buffer/4.2.2.Final 2025-09-06T00:41:12,954 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-toaster-operational: Shard created, persistent : false 2025-09-06T00:41:12,954 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.openflowjava.openflow-protocol-api/0.20.0 2025-09-06T00:41:12,955 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin/0.20.0 2025-09-06T00:41:12,957 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.truststore-api/9.0.0 2025-09-06T00:41:12,957 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.controller.sal-cluster-admin-impl/11.0.0 2025-09-06T00:41:12,962 | INFO | features-3-thread-1 
| FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.openflowplugin.blueprint-config/0.20.0 2025-09-06T00:41:12,963 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-3-shard-default-config/member-3-shard-default-config-notifier#-1100975759 created and ready for shard:member-3-shard-default-config 2025-09-06T00:41:12,963 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-3-shard-inventory-operational/member-3-shard-inventory-operational-notifier#-361439382 created and ready for shard:member-3-shard-inventory-operational 2025-09-06T00:41:12,964 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-3-shard-topology-config/member-3-shard-topology-config-notifier#1239622810 created and ready for shard:member-3-shard-topology-config 2025-09-06T00:41:12,965 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-3-shard-topology-operational/member-3-shard-topology-operational-notifier#603672131 created and ready for shard:member-3-shard-topology-operational 2025-09-06T00:41:12,966 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-3-shard-default-operational/member-3-shard-default-operational-notifier#-2075957464 created and ready for shard:member-3-shard-default-operational 2025-09-06T00:41:12,967 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-config: Starting recovery with journal batch size 1 2025-09-06T00:41:12,967 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-config: Starting recovery with journal batch size 1 2025-09-06T00:41:12,968 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | org.opendaylight.netconf.keystore-api/9.0.0 2025-09-06T00:41:12,968 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-operational: Starting recovery with journal batch size 1 2025-09-06T00:41:12,968 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational: Starting recovery with journal batch size 1 2025-09-06T00:41:12,969 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | 
member-3-shard-inventory-config: Starting recovery with journal batch size 1 2025-09-06T00:41:12,969 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-operational: Starting recovery with journal batch size 1 2025-09-06T00:41:12,969 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-3-shard-inventory-config/member-3-shard-inventory-config-notifier#-1557965179 created and ready for shard:member-3-shard-inventory-config 2025-09-06T00:41:12,970 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-3-shard-toaster-config/member-3-shard-toaster-config-notifier#1032726501 created and ready for shard:member-3-shard-toaster-config 2025-09-06T00:41:12,970 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-3-shard-toaster-operational/member-3-shard-toaster-operational-notifier#1880227783 created and ready for shard:member-3-shard-toaster-operational 2025-09-06T00:41:12,970 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-operational: Starting recovery with journal batch size 1 2025-09-06T00:41:12,973 | INFO | features-3-thread-1 | Activator | 99 - org.apache.karaf.deployer.features - 4.4.7 | Deployment finished. 
Registering FeatureDeploymentListener 2025-09-06T00:41:12,980 | INFO | opendaylight-cluster-data-pekko.persistence.dispatchers.default-plugin-dispatcher-46 | SegmentedFileJournal | 191 - org.opendaylight.controller.sal-akka-segmented-journal - 11.0.0 | Initialized with root directory segmented-journal with storage DISK 2025-09-06T00:41:12,984 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-config: Starting recovery with journal batch size 1 2025-09-06T00:41:13,052 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational: journal open: applyTo=0 2025-09-06T00:41:13,052 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-config: journal open: applyTo=0 2025-09-06T00:41:13,053 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config: journal open: applyTo=0 2025-09-06T00:41:13,053 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-config: journal open: applyTo=0 2025-09-06T00:41:13,053 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-operational: journal open: applyTo=0 2025-09-06T00:41:13,054 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-config: journal open: applyTo=0 2025-09-06T00:41:13,054 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-operational: journal open: applyTo=0 2025-09-06T00:41:13,054 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-operational: journal open: applyTo=0 2025-09-06T00:41:13,091 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-config: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-09-06T00:41:13,092 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-operational: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-09-06T00:41:13,093 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-operational: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-09-06T00:41:13,094 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational: Recovery completed - Switching actor 
to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-09-06T00:41:13,094 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-09-06T00:41:13,094 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-config: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-09-06T00:41:13,095 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-operational: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-09-06T00:41:13,097 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-config: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-09-06T00:41:13,101 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-config: Local TermInfo store seeded with TermInfo{term=0} 2025-09-06T00:41:13,103 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config: Local TermInfo store seeded with TermInfo{term=0} 2025-09-06T00:41:13,103 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational: Local TermInfo store seeded with TermInfo{term=0} 2025-09-06T00:41:13,103 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-config: Local TermInfo store seeded with TermInfo{term=0} 2025-09-06T00:41:13,104 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-operational: Local TermInfo store seeded with TermInfo{term=0} 2025-09-06T00:41:13,104 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-operational: Local TermInfo store seeded with TermInfo{term=0} 2025-09-06T00:41:13,104 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-operational: Local TermInfo store seeded with TermInfo{term=0} 2025-09-06T00:41:13,104 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-config: Local TermInfo store seeded with TermInfo{term=0} 2025-09-06T00:41:13,108 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | RoleChangeNotifier | 194 - 
org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-inventory-operational , received role change from null to Follower 2025-09-06T00:41:13,109 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-default-operational , received role change from null to Follower 2025-09-06T00:41:13,110 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-48 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-toaster-operational , received role change from null to Follower 2025-09-06T00:41:13,110 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-toaster-config , received role change from null to Follower 2025-09-06T00:41:13,110 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-default-config , received role change from null to Follower 2025-09-06T00:41:13,112 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-inventory-config , received role change from null to Follower 2025-09-06T00:41:13,117 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-toaster-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2025-09-06T00:41:13,117 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-default-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2025-09-06T00:41:13,117 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-toaster-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2025-09-06T00:41:13,117 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received role changed for member-3-shard-toaster-operational from null to Follower 2025-09-06T00:41:13,117 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-inventory-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2025-09-06T00:41:13,118 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for member-3-shard-default-config from null to Follower 2025-09-06T00:41:13,118 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - 
org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received role changed for member-3-shard-inventory-operational from null to Follower 2025-09-06T00:41:13,118 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for member-3-shard-toaster-config from null to Follower 2025-09-06T00:41:13,118 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-inventory-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2025-09-06T00:41:13,118 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-topology-operational , received role change from null to Follower 2025-09-06T00:41:13,119 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-topology-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2025-09-06T00:41:13,119 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for member-3-shard-inventory-config from null to Follower 2025-09-06T00:41:13,119 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-default-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2025-09-06T00:41:13,119 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received role changed for member-3-shard-topology-operational from null to Follower 2025-09-06T00:41:13,119 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-topology-config , received role change from null to Follower 2025-09-06T00:41:13,119 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received role changed for member-3-shard-default-operational from null to Follower 2025-09-06T00:41:13,120 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-topology-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2025-09-06T00:41:13,120 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for member-3-shard-topology-config from null to Follower 2025-09-06T00:41:13,281 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.7 | Done. 
2025-09-06T00:41:15,303 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-48 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/system/cluster/core/daemon/firstSeedNodeProcess-1#-1638567269]], but this node is not initialized yet
2025-09-06T00:41:15,304 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-48 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/system/cluster/core/daemon/firstSeedNodeProcess-1#-1638567269]], but this node is not initialized yet
2025-09-06T00:41:17,183 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-48 | CandidateRegistryInit | 185 - org.opendaylight.controller.eos-dom-akka - 11.0.0 | member-3 : Initial removal of candidates from previous iteration failed. Rescheduling.
java.util.concurrent.TimeoutException: Ask timed out on [Actor[pekko://opendaylight-cluster-data/system/singletonProxyOwnerSupervisor-no-dc#1732432706]] after [5000 ms]. Message of type [org.opendaylight.controller.eos.akka.owner.supervisor.command.ClearCandidatesForMember]. A typical reason for `AskTimeoutException` is that the recipient actor didn't send a reply.
	at org.apache.pekko.actor.typed.scaladsl.AskPattern$.$anonfun$onTimeout$1(AskPattern.scala:141) ~[bundleFile:?]
	at org.apache.pekko.pattern.PromiseActorRef$.$anonfun$apply$1(AskSupport.scala:737) ~[bundleFile:?]
	at org.apache.pekko.actor.Scheduler$$anon$7.run(Scheduler.scala:491) ~[bundleFile:?]
	at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[bundleFile:?]
	at org.apache.pekko.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(LightArrayRevolverScheduler.scala:384) ~[bundleFile:?]
	at org.apache.pekko.actor.LightArrayRevolverScheduler$$anon$3.executeBucket$1(LightArrayRevolverScheduler.scala:332) ~[bundleFile:?]
	at org.apache.pekko.actor.LightArrayRevolverScheduler$$anon$3.nextTick(LightArrayRevolverScheduler.scala:336) ~[bundleFile:?]
	at org.apache.pekko.actor.LightArrayRevolverScheduler$$anon$3.run(LightArrayRevolverScheduler.scala:288) ~[bundleFile:?]
	at java.lang.Thread.run(Thread.java:1583) ~[?:?]
2025-09-06T00:41:22,209 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | CandidateRegistryInit | 185 - org.opendaylight.controller.eos-dom-akka - 11.0.0 | member-3 : Initial removal of candidates from previous iteration failed. Rescheduling.
java.util.concurrent.TimeoutException: Ask timed out on [Actor[pekko://opendaylight-cluster-data/system/singletonProxyOwnerSupervisor-no-dc#1732432706]] after [5000 ms]. Message of type [org.opendaylight.controller.eos.akka.owner.supervisor.command.ClearCandidatesForMember]. A typical reason for `AskTimeoutException` is that the recipient actor didn't send a reply.
	at org.apache.pekko.actor.typed.scaladsl.AskPattern$.$anonfun$onTimeout$1(AskPattern.scala:141) ~[bundleFile:?]
	at org.apache.pekko.pattern.PromiseActorRef$.$anonfun$apply$1(AskSupport.scala:737) ~[bundleFile:?]
	at org.apache.pekko.actor.Scheduler$$anon$7.run(Scheduler.scala:491) ~[bundleFile:?]
	at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[bundleFile:?]
	at org.apache.pekko.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(LightArrayRevolverScheduler.scala:384) ~[bundleFile:?]
	at org.apache.pekko.actor.LightArrayRevolverScheduler$$anon$3.executeBucket$1(LightArrayRevolverScheduler.scala:332) ~[bundleFile:?]
	at org.apache.pekko.actor.LightArrayRevolverScheduler$$anon$3.nextTick(LightArrayRevolverScheduler.scala:336) ~[bundleFile:?]
	at org.apache.pekko.actor.LightArrayRevolverScheduler$$anon$3.run(LightArrayRevolverScheduler.scala:288) ~[bundleFile:?]
	at java.lang.Thread.run(Thread.java:1583) ~[?:?]
2025-09-06T00:41:23,144 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational (Candidate): Starting new election term 1
2025-09-06T00:41:23,145 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 1
2025-09-06T00:41:23,145 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-config (Candidate): Starting new election term 1
2025-09-06T00:41:23,145 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-config (Follower) :- Switching from behavior Follower to Candidate, election term: 1
2025-09-06T00:41:23,146 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-default-operational , received role change from Follower to Candidate
2025-09-06T00:41:23,146 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-default-config , received role change from Follower to Candidate
2025-09-06T00:41:23,146 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for member-3-shard-default-config from Follower to Candidate
2025-09-06T00:41:23,147 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received role changed for member-3-shard-default-operational from Follower to Candidate
2025-09-06T00:41:23,154 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-operational (Candidate): Starting new election term 1
2025-09-06T00:41:23,154 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 1
2025-09-06T00:41:23,155 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-topology-operational , received role change from Follower to Candidate
2025-09-06T00:41:23,155 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received role changed for member-3-shard-topology-operational from Follower to Candidate
2025-09-06T00:41:23,164 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-config (Candidate): Starting new election term 1
2025-09-06T00:41:23,165 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-config (Follower) :- Switching from behavior Follower to Candidate, election term: 1
2025-09-06T00:41:23,165 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-toaster-config , received role change from Follower to Candidate
2025-09-06T00:41:23,165 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for member-3-shard-toaster-config from Follower to Candidate
2025-09-06T00:41:23,173 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config (Candidate): Starting new election term 1
2025-09-06T00:41:23,174 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config (Follower) :- Switching from behavior Follower to Candidate, election term: 1
2025-09-06T00:41:23,174 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-inventory-config , received role change from Follower to Candidate
2025-09-06T00:41:23,174 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for member-3-shard-inventory-config from Follower to Candidate
2025-09-06T00:41:23,174 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-operational (Candidate): Starting new election term 1
2025-09-06T00:41:23,175 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 1
2025-09-06T00:41:23,175 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-toaster-operational , received role change from Follower to Candidate
2025-09-06T00:41:23,175 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received role changed for member-3-shard-toaster-operational from Follower to Candidate
2025-09-06T00:41:23,184 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-config (Candidate): Starting new election term 1
2025-09-06T00:41:23,184 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-config (Follower) :- Switching from behavior Follower to Candidate, election term: 1
2025-09-06T00:41:23,184 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-topology-config , received role change from Follower to Candidate
2025-09-06T00:41:23,184 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for member-3-shard-topology-config from Follower to Candidate
2025-09-06T00:41:23,214 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-operational (Candidate): Starting new election term 1
2025-09-06T00:41:23,214 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 1
2025-09-06T00:41:23,215 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-inventory-operational , received role change from Follower to Candidate
2025-09-06T00:41:23,216 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received role changed for member-3-shard-inventory-operational from Follower to Candidate
2025-09-06T00:41:26,518 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#1523469842]], but this node is not initialized yet
2025-09-06T00:41:27,229 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | CandidateRegistryInit | 185 - org.opendaylight.controller.eos-dom-akka - 11.0.0 | member-3 : Initial removal of candidates from previous iteration failed. Rescheduling.
java.util.concurrent.TimeoutException: Ask timed out on [Actor[pekko://opendaylight-cluster-data/system/singletonProxyOwnerSupervisor-no-dc#1732432706]] after [5000 ms]. Message of type [org.opendaylight.controller.eos.akka.owner.supervisor.command.ClearCandidatesForMember]. A typical reason for `AskTimeoutException` is that the recipient actor didn't send a reply.
	at org.apache.pekko.actor.typed.scaladsl.AskPattern$.$anonfun$onTimeout$1(AskPattern.scala:141) ~[bundleFile:?]
	at org.apache.pekko.pattern.PromiseActorRef$.$anonfun$apply$1(AskSupport.scala:737) ~[bundleFile:?]
	at org.apache.pekko.actor.Scheduler$$anon$7.run(Scheduler.scala:491) ~[bundleFile:?]
	at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[bundleFile:?]
	at org.apache.pekko.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(LightArrayRevolverScheduler.scala:384) ~[bundleFile:?]
	at org.apache.pekko.actor.LightArrayRevolverScheduler$$anon$3.executeBucket$1(LightArrayRevolverScheduler.scala:332) ~[bundleFile:?]
	at org.apache.pekko.actor.LightArrayRevolverScheduler$$anon$3.nextTick(LightArrayRevolverScheduler.scala:336) ~[bundleFile:?]
	at org.apache.pekko.actor.LightArrayRevolverScheduler$$anon$3.run(LightArrayRevolverScheduler.scala:288) ~[bundleFile:?]
	at java.lang.Thread.run(Thread.java:1583) ~[?:?]
2025-09-06T00:41:27,356 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoinAck message from [Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/system/cluster/core/daemon#-1335179633]] to [pekko://opendaylight-cluster-data@10.30.170.226:2550]
2025-09-06T00:41:27,391 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Welcome from [pekko://opendaylight-cluster-data@10.30.171.195:2550]
2025-09-06T00:41:27,395 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.171.195:2550
2025-09-06T00:41:27,395 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-1-shard-default-config with address pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-config/member-1-shard-default-config
2025-09-06T00:41:27,395 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-1-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-config/member-1-shard-topology-config
2025-09-06T00:41:27,395 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-1-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-config/member-1-shard-inventory-config
2025-09-06T00:41:27,395 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-1-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-config/member-1-shard-toaster-config
2025-09-06T00:41:27,396 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-config: Peer address for peer member-1-shard-default-config set to pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-config/member-1-shard-default-config
2025-09-06T00:41:27,396 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-config: Peer address
for peer member-1-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-config/member-1-shard-topology-config 2025-09-06T00:41:27,396 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config: Peer address for peer member-1-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-config/member-1-shard-inventory-config 2025-09-06T00:41:27,397 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-config: Peer address for peer member-1-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-config/member-1-shard-toaster-config 2025-09-06T00:41:27,397 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.171.248:2550 2025-09-06T00:41:27,397 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-default-config with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-default-config 2025-09-06T00:41:27,397 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-topology-config 2025-09-06T00:41:27,397 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-inventory-config 2025-09-06T00:41:27,397 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-toaster-config 2025-09-06T00:41:27,397 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-config: Peer address for peer member-2-shard-default-config set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-default-config 2025-09-06T00:41:27,397 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-config: Peer address for peer member-2-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-topology-config 2025-09-06T00:41:27,397 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config: Peer address for peer member-2-shard-inventory-config set to 
pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-inventory-config 2025-09-06T00:41:27,397 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.171.195:2550 2025-09-06T00:41:27,398 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-1-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-operational/member-1-shard-default-operational 2025-09-06T00:41:27,398 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-1-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-operational/member-1-shard-topology-operational 2025-09-06T00:41:27,398 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-1-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-operational/member-1-shard-inventory-operational 2025-09-06T00:41:27,398 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-1-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-operational/member-1-shard-toaster-operational 2025-09-06T00:41:27,398 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-config: Peer address for peer member-2-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-toaster-config 2025-09-06T00:41:27,398 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational: Peer address for peer member-1-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-operational/member-1-shard-default-operational 2025-09-06T00:41:27,398 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-operational: Peer address for peer member-1-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-operational/member-1-shard-topology-operational 2025-09-06T00:41:27,399 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.171.248:2550 2025-09-06T00:41:27,398 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-operational: Peer address for peer member-1-shard-inventory-operational set to 
pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-operational/member-1-shard-inventory-operational 2025-09-06T00:41:27,399 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-operational: Peer address for peer member-1-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-operational/member-1-shard-toaster-operational 2025-09-06T00:41:27,399 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-default-operational 2025-09-06T00:41:27,399 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-topology-operational 2025-09-06T00:41:27,399 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational: Peer address for peer member-2-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-default-operational 2025-09-06T00:41:27,399 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-inventory-operational 2025-09-06T00:41:27,399 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-toaster-operational 2025-09-06T00:41:27,399 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-operational: Peer address for peer member-2-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-inventory-operational 2025-09-06T00:41:27,399 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-operational: Peer address for peer member-2-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-topology-operational 2025-09-06T00:41:27,400 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-operational: Peer address for peer member-2-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-toaster-operational 2025-09-06T00:41:27,409 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | LocalActorRef | 
189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/registry/gossiper#1173784058] was unhandled. [1] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-09-06T00:41:27,410 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#86549496] was unhandled. [2] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-09-06T00:41:27,475 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-48 | ClusterSingletonProxy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Singleton identified at [pekko://opendaylight-cluster-data@10.30.171.195:2550/system/singletonManagerOwnerSupervisor/OwnerSupervisor]
2025-09-06T00:41:27,531 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | EmptyLocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.eos.akka.owner.supervisor.command.ClearCandidatesResponse] to Actor[pekko://opendaylight-cluster-data/temp/$c] was not delivered. [3] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/temp/$c] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-09-06T00:41:27,532 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | EmptyLocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.eos.akka.owner.supervisor.command.ClearCandidatesResponse] to Actor[pekko://opendaylight-cluster-data/temp/$a] was not delivered. [4] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/temp/$a] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-09-06T00:41:27,532 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | EmptyLocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.eos.akka.owner.supervisor.command.ClearCandidatesResponse] to Actor[pekko://opendaylight-cluster-data/temp/$b] was not delivered. [5] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/temp/$b] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
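The 'Ask timed out' warning from CandidateRegistryInit earlier in the startup and the ClearCandidatesResponse dead letters just above appear to be two views of the same thing: the ask's temporary actors (/temp/$a, /temp/$b, /temp/$c) expired after the 5000 ms timeout, so the replies that eventually came back from the owner supervisor had nowhere to go, and CandidateRegistryInit simply reschedules the attempt. A minimal sketch of the Pekko typed ask pattern that produces this behaviour, using hypothetical message types rather than the real eos-dom-akka commands:

    import java.time.Duration;
    import java.util.concurrent.CompletionStage;

    import org.apache.pekko.actor.typed.ActorRef;
    import org.apache.pekko.actor.typed.ActorSystem;
    import org.apache.pekko.actor.typed.javadsl.AskPattern;
    import org.apache.pekko.actor.typed.javadsl.Behaviors;

    public final class AskTimeoutSketch {
        // Hypothetical stand-ins for ClearCandidatesForMember / ClearCandidatesResponse.
        record ClearCandidates(String memberName, ActorRef<CandidatesCleared> replyTo) { }
        record CandidatesCleared(String memberName) { }

        public static void main(String[] args) {
            // A supervisor stand-in that never replies, mimicking the singleton proxy
            // that cannot be resolved while the cluster is still forming.
            ActorSystem<ClearCandidates> supervisor =
                ActorSystem.create(Behaviors.<ClearCandidates>ignore(), "owner-supervisor-sketch");

            CompletionStage<CandidatesCleared> reply = AskPattern.ask(
                supervisor,
                replyTo -> new ClearCandidates("member-3", replyTo),
                Duration.ofMillis(5000),              // the same 5000 ms budget seen in the log
                supervisor.scheduler());

            // With no answer inside the timeout, the stage fails with
            // java.util.concurrent.TimeoutException ("Ask timed out on ...").
            reply.whenComplete((ok, failure) -> {
                if (failure != null) {
                    System.err.println("ask failed, a retry would be scheduled here: " + failure);
                }
                supervisor.terminate();
            });
        }
    }

A reply that arrives after the timeout finds the temporary ask actor already gone, which is exactly the 'was not delivered ... may have terminated unexpectedly' situation EmptyLocalActorRef reports above.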
2025-09-06T00:41:29,367 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.170.226:2550
2025-09-06T00:41:29,367 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-3-shard-default-config with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-3-shard-default-config
2025-09-06T00:41:29,368 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-3-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-3-shard-topology-config
2025-09-06T00:41:29,368 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-3-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-3-shard-inventory-config
2025-09-06T00:41:29,368 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-3-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-3-shard-toaster-config
2025-09-06T00:41:29,368 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.170.226:2550
2025-09-06T00:41:29,369 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | This node is now the leader responsible for taking SBR decisions among the reachable nodes (more leaders may exist).
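The join sequence above (an InitJoin from 10.30.171.248 answered with 'not initialized yet', then InitJoinAck and Welcome from 10.30.171.195, followed by MemberUp for member-1, member-2 and finally member-3 itself) is standard Pekko Cluster seed-node bootstrapping. In OpenDaylight the system name, bind address and seed list come from the controller's clustering configuration files; the sketch below is only an illustration of the equivalent programmatic join, reusing the addresses visible in the log:

    import java.util.List;

    import com.typesafe.config.Config;
    import com.typesafe.config.ConfigFactory;

    import org.apache.pekko.actor.ActorSystem;
    import org.apache.pekko.actor.Address;
    import org.apache.pekko.cluster.Cluster;

    public final class ClusterJoinSketch {
        public static void main(String[] args) {
            // Local node settings mirroring the log's member-3 (10.30.170.226:2550).
            Config config = ConfigFactory.parseString(
                "pekko.actor.provider = cluster\n"
                + "pekko.remote.artery.canonical.hostname = \"10.30.170.226\"\n"
                + "pekko.remote.artery.canonical.port = 2550\n")
                .withFallback(ConfigFactory.load());

            ActorSystem system = ActorSystem.create("opendaylight-cluster-data", config);

            // Seed addresses taken from the log; the joining node keeps retrying InitJoin
            // until one seed answers with InitJoinAck and then sends Welcome.
            List<Address> seeds = List.of(
                new Address("pekko", "opendaylight-cluster-data", "10.30.171.195", 2550),
                new Address("pekko", "opendaylight-cluster-data", "10.30.171.248", 2550));

            Cluster.get(system).joinSeedNodes(seeds);
        }
    }

The SplitBrainResolver line is a side effect of membership converging: it only records which node currently takes SBR decisions among the members it can reach.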
2025-09-06T00:41:29,369 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-3-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-3-shard-default-operational 2025-09-06T00:41:29,369 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-3-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-3-shard-topology-operational 2025-09-06T00:41:29,369 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-3-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-3-shard-inventory-operational 2025-09-06T00:41:29,369 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-3-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-3-shard-toaster-operational 2025-09-06T00:41:29,376 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | ClusterSingletonManager state change [Start -> Younger] 2025-09-06T00:41:30,102 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - is the new leader among reachable nodes (more leaders may exist) 2025-09-06T00:41:32,056 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational (Candidate): Term 2 in "RequestVote{term=2, candidateId=member-2-shard-default-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 1 - switching to Follower 2025-09-06T00:41:32,059 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-config (Candidate): Term 2 in "RequestVote{term=2, candidateId=member-2-shard-toaster-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 1 - switching to Follower 2025-09-06T00:41:32,072 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational (Candidate) :- Switching from behavior Candidate to Follower, election term: 2 2025-09-06T00:41:32,072 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-default-operational , received role change from Candidate to Follower 2025-09-06T00:41:32,073 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received role changed for 
member-3-shard-default-operational from Candidate to Follower 2025-09-06T00:41:32,074 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-config (Candidate) :- Switching from behavior Candidate to Follower, election term: 2 2025-09-06T00:41:32,075 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-toaster-config , received role change from Candidate to Follower 2025-09-06T00:41:32,075 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for member-3-shard-toaster-config from Candidate to Follower 2025-09-06T00:41:32,092 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-config (Candidate): Term 2 in "RequestVote{term=2, candidateId=member-2-shard-topology-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 1 - switching to Follower 2025-09-06T00:41:32,093 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-operational (Candidate): Term 2 in "RequestVote{term=2, candidateId=member-2-shard-toaster-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 1 - switching to Follower 2025-09-06T00:41:32,094 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@51ba6882 2025-09-06T00:41:32,095 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config Received follower initial sync status for member-3-shard-toaster-config status sync done false 2025-09-06T00:41:32,096 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@231f865a 2025-09-06T00:41:32,098 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-config (Candidate): Term 2 in "RequestVote{term=2, candidateId=member-2-shard-default-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 1 - switching to Follower 2025-09-06T00:41:32,098 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational Received follower initial sync status for member-3-shard-default-operational status sync done false 2025-09-06T00:41:32,105 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-operational (Candidate): Term 2 in "RequestVote{term=2, candidateId=member-2-shard-inventory-operational, lastLogIndex=-1, lastLogTerm=-1}" message is 
greater than Candidate's term 1 - switching to Follower 2025-09-06T00:41:32,105 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-config (Candidate) :- Switching from behavior Candidate to Follower, election term: 2 2025-09-06T00:41:32,105 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-topology-config , received role change from Candidate to Follower 2025-09-06T00:41:32,106 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for member-3-shard-topology-config from Candidate to Follower 2025-09-06T00:41:32,107 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@4a274ed7 2025-09-06T00:41:32,107 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-operational (Candidate) :- Switching from behavior Candidate to Follower, election term: 2 2025-09-06T00:41:32,107 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-toaster-operational , received role change from Candidate to Follower 2025-09-06T00:41:32,107 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config Received follower initial sync status for member-3-shard-topology-config status sync done false 2025-09-06T00:41:32,107 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received role changed for member-3-shard-toaster-operational from Candidate to Follower 2025-09-06T00:41:32,108 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@6c35ab76 2025-09-06T00:41:32,108 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational Received follower initial sync status for member-3-shard-toaster-operational status sync done false 2025-09-06T00:41:32,111 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-config (Candidate) :- Switching from behavior Candidate to Follower, election term: 2 2025-09-06T00:41:32,111 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-default-config , received role change from Candidate to Follower 2025-09-06T00:41:32,111 | INFO | 
opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for member-3-shard-default-config from Candidate to Follower 2025-09-06T00:41:32,112 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config Received follower initial sync status for member-3-shard-default-config status sync done false 2025-09-06T00:41:32,112 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@47961757 2025-09-06T00:41:32,117 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-operational (Candidate) :- Switching from behavior Candidate to Follower, election term: 2 2025-09-06T00:41:32,117 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-inventory-operational , received role change from Candidate to Follower 2025-09-06T00:41:32,117 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received role changed for member-3-shard-inventory-operational from Candidate to Follower 2025-09-06T00:41:32,118 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@4b5f7c3c 2025-09-06T00:41:32,118 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational Received follower initial sync status for member-3-shard-inventory-operational status sync done false 2025-09-06T00:41:32,139 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-operational (Candidate): Term 2 in "RequestVote{term=2, candidateId=member-2-shard-topology-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 1 - switching to Follower 2025-09-06T00:41:32,142 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config (Candidate): Term 2 in "RequestVote{term=2, candidateId=member-2-shard-inventory-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 1 - switching to Follower 2025-09-06T00:41:32,151 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-operational (Candidate) :- Switching from behavior Candidate to Follower, election term: 2 2025-09-06T00:41:32,151 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for 
member-3-shard-topology-operational , received role change from Candidate to Follower 2025-09-06T00:41:32,152 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received role changed for member-3-shard-topology-operational from Candidate to Follower 2025-09-06T00:41:32,152 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config (Candidate) :- Switching from behavior Candidate to Follower, election term: 2 2025-09-06T00:41:32,152 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-inventory-config , received role change from Candidate to Follower 2025-09-06T00:41:32,152 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for member-3-shard-inventory-config from Candidate to Follower 2025-09-06T00:41:32,165 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config Received follower initial sync status for member-3-shard-inventory-config status sync done false 2025-09-06T00:41:32,165 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@147ca7b3 2025-09-06T00:41:32,166 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: All Shards are ready - data store operational is ready 2025-09-06T00:41:32,166 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational Received follower initial sync status for member-3-shard-topology-operational status sync done false 2025-09-06T00:41:32,166 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@200ca9e6 2025-09-06T00:41:32,166 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: All Shards are ready - data store config is ready 2025-09-06T00:41:32,169 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | OSGiDOMStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Datastore service type OPERATIONAL activated 2025-09-06T00:41:32,170 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | OSGiDistributedDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Distributed Datastore type OPERATIONAL started 2025-09-06T00:41:32,185 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | ConcurrentDOMDataBroker | 358 - org.opendaylight.yangtools.util - 14.0.14 | ThreadFactory 
created: CommitFutures
2025-09-06T00:41:32,187 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | DataBrokerCommitExecutor | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | DOM Data Broker commit executor started
2025-09-06T00:41:32,187 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | OSGiDOMStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Datastore service type CONFIGURATION activated
2025-09-06T00:41:32,189 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | ConcurrentDOMDataBroker | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | DOM Data Broker started
2025-09-06T00:41:32,193 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.13 | Binding/DOM adapter for DataBroker activated
2025-09-06T00:41:32,285 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-default-config#1555359261], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}
2025-09-06T00:41:32,287 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=0}, cookie=0} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-default-config#1555359261], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}}
2025-09-06T00:41:32,306 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | ArbitratorReconciliationManagerImpl | 296 - org.opendaylight.openflowplugin.applications.arbitratorreconciliation-impl - 0.20.0 | ArbitratorReconciliationManager has started successfully.
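All of the Candidate-to-Follower switches logged in this stretch follow the same Raft rule: a node that sees a message carrying a term higher than its own adopts that term and falls back to Follower, which is why member-3's shards, still campaigning in term 1, step down as soon as member-2's RequestVote arrives with term 2. Once every local shard has found the leader on another member, ShardManager reports 'All Shards are ready' and the datastores and DataBroker activate as shown above. A compact, generic illustration of the term rule (not the sal-akka-raft implementation):

    /** Generic sketch of the Raft term rule seen in the log; not ODL's RaftActorBehavior. */
    public final class RaftTermRuleSketch {
        enum Role { FOLLOWER, CANDIDATE, LEADER }

        // Hypothetical, trimmed-down RequestVote message.
        record RequestVote(long term, String candidateId, long lastLogIndex, long lastLogTerm) { }

        private Role role = Role.CANDIDATE;   // member-3's shards were campaigning
        private long currentTerm = 1;         // in election term 1

        Role onRequestVote(RequestVote vote) {
            if (vote.term() > currentTerm) {
                // "Term 2 ... is greater than Candidate's term 1 - switching to Follower"
                currentTerm = vote.term();
                role = Role.FOLLOWER;
            }
            // Whether the vote is actually granted (log up-to-date check, one vote per term)
            // is a separate decision, omitted here.
            return role;
        }

        public static void main(String[] args) {
            RaftTermRuleSketch shard = new RaftTermRuleSketch();
            Role after = shard.onRequestVote(
                new RequestVote(2, "member-2-shard-default-operational", -1, -1));
            System.out.println("role after RequestVote(term=2): " + after);   // FOLLOWER
        }
    }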
2025-09-06T00:41:32,309 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=0}, cookie=0} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-default-config#1555359261], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} in 23.23 ms 2025-09-06T00:41:32,319 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.0 is waiting for dependencies [(objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService), Initial app config AaaCertServiceConfig] 2025-09-06T00:41:32,328 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.0 is waiting for dependencies [Initial app config TopologyLldpDiscoveryConfig, (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)] 2025-09-06T00:41:32,336 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.0 is waiting for dependencies [Initial app config LldpSpeakerConfig] 2025-09-06T00:41:32,358 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-default-operational#-1116809748], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent} 2025-09-06T00:41:32,359 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=0}, cookie=0} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-default-operational#-1116809748], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} 2025-09-06T00:41:32,359 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=0}, cookie=0} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=0}, cookie=0, 
backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-default-operational#-1116809748], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} in 357.5 μs
2025-09-06T00:41:32,365 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.0 is waiting for dependencies [Initial app config ForwardingRulesManagerConfig, (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)]
2025-09-06T00:41:32,368 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Loading properties from '(urn:opendaylight:params:xml:ns:yang:openflow:provider:config?revision=2016-05-10)openflow-provider-config' YANG file
2025-09-06T00:41:32,369 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.0 | DeviceOwnershipService started
2025-09-06T00:41:32,376 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | rpc-requests-quota configuration property was changed to '20000'
2025-09-06T00:41:32,376 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | global-notification-quota configuration property was changed to '64000'
2025-09-06T00:41:32,376 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | switch-features-mandatory configuration property was changed to 'false'
2025-09-06T00:41:32,376 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | enable-flow-removed-notification configuration property was changed to 'true'
2025-09-06T00:41:32,376 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | is-statistics-rpc-enabled configuration property was changed to 'false'
2025-09-06T00:41:32,376 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | barrier-count-limit configuration property was changed to '25600'
2025-09-06T00:41:32,376 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | barrier-interval-timeout-limit configuration property was changed to '500'
2025-09-06T00:41:32,376 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | echo-reply-timeout configuration property was changed to '2000'
2025-09-06T00:41:32,376 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | is-statistics-polling-on configuration property was changed to 'true'
2025-09-06T00:41:32,376 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | is-table-statistics-polling-on configuration property was changed to 'true'
2025-09-06T00:41:32,376 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | is-flow-statistics-polling-on configuration property was changed to 'true'
2025-09-06T00:41:32,377 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | is-group-statistics-polling-on configuration property was changed to 'true'
2025-09-06T00:41:32,377 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | is-meter-statistics-polling-on configuration property was changed to 'true'
2025-09-06T00:41:32,377 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | is-port-statistics-polling-on configuration property was changed to 'true'
2025-09-06T00:41:32,377 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | is-queue-statistics-polling-on configuration property was changed to 'true'
2025-09-06T00:41:32,377 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | skip-table-features configuration property was changed to 'true'
2025-09-06T00:41:32,377 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | basic-timer-delay configuration property was changed to '3000'
2025-09-06T00:41:32,377 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | maximum-timer-delay configuration property was changed to '900000'
2025-09-06T00:41:32,377 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | use-single-layer-serialization configuration property was changed to 'true'
2025-09-06T00:41:32,377 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | thread-pool-min-threads configuration property was changed to '1'
2025-09-06T00:41:32,377 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | thread-pool-max-threads configuration property was changed to '32000'
2025-09-06T00:41:32,377 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | thread-pool-timeout configuration property was changed to '60'
2025-09-06T00:41:32,377 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | device-connection-rate-limit-per-min configuration property was changed to '0'
2025-09-06T00:41:32,377 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | device-connection-hold-time-in-seconds configuration property was changed to '0'
2025-09-06T00:41:32,377 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | device-datastore-removal-delay configuration property was changed to '500'
2025-09-06T00:41:32,377 | INFO | Blueprint Extender: 3 | OSGiConfigurationServiceFactory | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Loading configuration from 'org.opendaylight.openflowplugin' configuration file
2025-09-06T00:41:32,382 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.0 is waiting for dependencies [(objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService)]
2025-09-06T00:41:32,383 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)]
2025-09-06T00:41:32,388 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)]
2025-09-06T00:41:32,396 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | felix.fileinstall.filename configuration property was changed to 'file:/tmp/karaf-0.23.0/etc/org.opendaylight.openflowplugin.cfg'
2025-09-06T00:41:32,396 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | service.pid configuration property was changed to 'org.opendaylight.openflowplugin'
2025-09-06T00:41:32,407 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.impl/0.20.0 has been started
2025-09-06T00:41:32,415 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational Received follower initial sync status for member-3-shard-default-operational status sync done true
2025-09-06T00:41:32,417 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.0 | Blueprint container for bundle org.opendaylight.openflowplugin.impl_0.20.0 [309] was successfully created
2025-09-06T00:41:32,451 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | DefaultConfigPusher | 301 - org.opendaylight.openflowplugin.applications.of-switch-config-pusher - 0.20.0 | DefaultConfigPusher has started.
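The ConfigurationServiceFactoryImpl entries above show OpenFlowPlugin layering settings from etc/org.opendaylight.openflowplugin.cfg (surfaced via felix.fileinstall.filename) on top of the defaults declared by the openflow-provider-config YANG model. Karaf .cfg files use plain Java properties syntax, so a small, illustrative reader for a few of the keys named in the log could look like the sketch below; the real wiring goes through Felix FileInstall and Config Admin rather than direct file reads, and the fallback values here are simply the ones echoed in the log:

    import java.io.IOException;
    import java.io.Reader;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.Properties;

    public final class OpenflowCfgPeek {
        public static void main(String[] args) throws IOException {
            // Path reported by felix.fileinstall.filename in the log.
            Path cfg = Path.of("/tmp/karaf-0.23.0/etc/org.opendaylight.openflowplugin.cfg");

            Properties props = new Properties();
            if (Files.exists(cfg)) {
                try (Reader reader = Files.newBufferedReader(cfg)) {
                    props.load(reader);   // .cfg files are java.util.Properties format
                }
            }

            // A few of the keys echoed by ConfigurationServiceFactoryImpl above.
            int rpcRequestsQuota = Integer.parseInt(
                props.getProperty("rpc-requests-quota", "20000"));
            boolean statisticsPollingOn = Boolean.parseBoolean(
                props.getProperty("is-statistics-polling-on", "true"));
            int barrierCountLimit = Integer.parseInt(
                props.getProperty("barrier-count-limit", "25600"));

            System.out.printf("rpc-requests-quota=%d, is-statistics-polling-on=%b, barrier-count-limit=%d%n",
                rpcRequestsQuota, statisticsPollingOn, barrierCountLimit);
        }
    }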
2025-09-06T00:41:32,455 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config Received follower initial sync status for member-3-shard-default-config status sync done true 2025-09-06T00:41:32,494 | INFO | Blueprint Extender: 1 | LLDPSpeaker | 300 - org.opendaylight.openflowplugin.applications.lldp-speaker - 0.20.0 | LLDPSpeaker started, it will send LLDP frames each 5 seconds 2025-09-06T00:41:32,506 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | org.opendaylight.openflowplugin.applications.topology.lldp.LLDPLinkAger@3e11c310 was registered as configuration listener to OpenFlowPlugin configuration service 2025-09-06T00:41:32,527 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | org.opendaylight.openflowplugin.applications.frm.impl.ForwardingRulesManagerImpl@3c995950 was registered as configuration listener to OpenFlowPlugin configuration service 2025-09-06T00:41:32,564 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | OSGiPasswordServiceConfigBootstrap | 170 - org.opendaylight.aaa.password-service-impl - 0.21.0 | Listening for password service configuration 2025-09-06T00:41:32,574 | INFO | Blueprint Extender: 2 | LLDPActivator | 303 - org.opendaylight.openflowplugin.applications.topology-lldp-discovery - 0.20.0 | Starting LLDPActivator with lldpSecureKey: aa9251f8-c7c0-4322-b8d6-c3a84593bda3 2025-09-06T00:41:32,577 | ERROR | opendaylight-cluster-data-notification-dispatcher-52 | H2Store | 167 - org.opendaylight.aaa.idm-store-h2 - 0.21.0 | bundle org.opendaylight.aaa.idm-store-h2:0.21.0 (167)[org.opendaylight.aaa.datastore.h2.H2Store(84)] : Constructor argument 0 in class class org.opendaylight.aaa.datastore.h2.H2Store has unsupported type org.opendaylight.aaa.datastore.h2.ConnectionProvider 2025-09-06T00:41:32,581 | INFO | opendaylight-cluster-data-notification-dispatcher-52 | DefaultPasswordHashService | 170 - org.opendaylight.aaa.password-service-impl - 0.21.0 | DefaultPasswordHashService will utilize default iteration count=20000 2025-09-06T00:41:32,581 | INFO | opendaylight-cluster-data-notification-dispatcher-52 | DefaultPasswordHashService | 170 - org.opendaylight.aaa.password-service-impl - 0.21.0 | DefaultPasswordHashService will utilize default algorithm=SHA-512 2025-09-06T00:41:32,581 | INFO | opendaylight-cluster-data-notification-dispatcher-52 | DefaultPasswordHashService | 170 - org.opendaylight.aaa.password-service-impl - 0.21.0 | DefaultPasswordHashService will not utilize a private salt, since none was configured 2025-09-06T00:41:32,514 | WARN | opendaylight-cluster-data-notification-dispatcher-50 | DefaultUpgradeState | 296 - org.opendaylight.openflowplugin.applications.arbitratorreconciliation-impl - 0.20.0 | Failed to write operational state java.util.concurrent.ExecutionException: OptimisticLockFailedException{message=Optimistic lock failed for path /(urn:opendaylight:serviceutils:upgrade?revision=2018-07-02)upgrade-config, errorList=[RpcError [message=Optimistic lock failed for path /(urn:opendaylight:serviceutils:upgrade?revision=2018-07-02)upgrade-config, severity=ERROR, errorType=APPLICATION, tag=resource-denied, applicationTag=null, info=null, cause=org.opendaylight.yangtools.yang.data.tree.api.ConflictingModificationAppliedException: Node was created by other transaction.]]} at 
com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture$TrustedFuture.get(AbstractFuture.java:96) ~[bundleFile:?] at com.google.common.util.concurrent.ForwardingFuture.get(ForwardingFuture.java:66) ~[bundleFile:?] at com.google.common.util.concurrent.ForwardingFluentFuture.get(ForwardingFluentFuture.java:67) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.arbitratorreconciliation.impl.DefaultUpgradeState.dataChangedTo(DefaultUpgradeState.java:75) ~[?:?] at org.opendaylight.openflowplugin.applications.arbitratorreconciliation.impl.DefaultUpgradeState.dataChangedTo(DefaultUpgradeState.java:34) ~[?:?] at org.opendaylight.mdsal.binding.api.DataListenerAdapter.onInitialData(DataListenerAdapter.java:30) ~[bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMDataTreeChangeListenerAdapter.onInitialData(BindingDOMDataTreeChangeListenerAdapter.java:65) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.DataTreeChangeListenerActor.onInitialData(DataTreeChangeListenerActor.java:70) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.DataTreeChangeListenerActor.handleReceive(DataTreeChangeListenerActor.java:56) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) [bundleFile:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:270) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.mdsal.common.api.OptimisticLockFailedException: Optimistic lock failed for path /(urn:opendaylight:serviceutils:upgrade?revision=2018-07-02)upgrade-config at org.opendaylight.controller.cluster.datastore.ShardDataTree.canCommitEntry(ShardDataTree.java:850) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.ShardDataTree.processNextPendingTransaction(ShardDataTree.java:829) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.ShardDataTree.startCanCommit(ShardDataTree.java:992) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.CommitCohort.canCommit(CommitCohort.java:135) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.FrontendReadWriteTransaction.directCommit(FrontendReadWriteTransaction.java:426) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.FrontendReadWriteTransaction.handleModifyTransaction(FrontendReadWriteTransaction.java:595) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.FrontendReadWriteTransaction.doHandleRequest(FrontendReadWriteTransaction.java:197) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.FrontendTransaction.handleRequest(FrontendTransaction.java:135) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.AbstractFrontendHistory.handleTransactionRequest(AbstractFrontendHistory.java:122) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:133) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] ... 10 more Caused by: org.opendaylight.yangtools.yang.data.tree.api.ConflictingModificationAppliedException: Node was created by other transaction. at org.opendaylight.yangtools.yang.data.tree.impl.SchemaAwareApplyOperation.checkConflicting(SchemaAwareApplyOperation.java:69) ~[bundleFile:?] at org.opendaylight.yangtools.yang.data.tree.impl.SchemaAwareApplyOperation.checkWriteApplicable(SchemaAwareApplyOperation.java:172) ~[bundleFile:?] at org.opendaylight.yangtools.yang.data.tree.impl.SchemaAwareApplyOperation.checkApplicable(SchemaAwareApplyOperation.java:102) ~[bundleFile:?] at org.opendaylight.yangtools.yang.data.tree.impl.AbstractNodeContainerModificationStrategy.checkChildPreconditions(AbstractNodeContainerModificationStrategy.java:441) ~[bundleFile:?] at org.opendaylight.yangtools.yang.data.tree.impl.AbstractNodeContainerModificationStrategy.checkTouchApplicable(AbstractNodeContainerModificationStrategy.java:400) ~[bundleFile:?] at org.opendaylight.yangtools.yang.data.tree.impl.SchemaAwareApplyOperation.checkApplicable(SchemaAwareApplyOperation.java:101) ~[bundleFile:?] at org.opendaylight.yangtools.yang.data.tree.impl.InMemoryDataTreeModification.validate(InMemoryDataTreeModification.java:615) ~[bundleFile:?] at org.opendaylight.yangtools.yang.data.tree.impl.InMemoryDataTreeModification.lockedValidate(InMemoryDataTreeModification.java:625) ~[bundleFile:?] at org.opendaylight.yangtools.yang.data.tree.impl.InMemoryDataTreeModification.validate(InMemoryDataTreeModification.java:603) ~[bundleFile:?] at org.opendaylight.yangtools.yang.data.tree.impl.AbstractDataTreeTip.validate(AbstractDataTreeTip.java:33) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.ShardDataTree.canCommitEntry(ShardDataTree.java:843) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.ShardDataTree.processNextPendingTransaction(ShardDataTree.java:829) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.ShardDataTree.startCanCommit(ShardDataTree.java:992) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.CommitCohort.canCommit(CommitCohort.java:135) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.FrontendReadWriteTransaction.directCommit(FrontendReadWriteTransaction.java:426) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.FrontendReadWriteTransaction.handleModifyTransaction(FrontendReadWriteTransaction.java:595) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.FrontendReadWriteTransaction.doHandleRequest(FrontendReadWriteTransaction.java:197) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.FrontendTransaction.handleRequest(FrontendTransaction.java:135) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.AbstractFrontendHistory.handleTransactionRequest(AbstractFrontendHistory.java:122) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:133) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] ... 10 more 2025-09-06T00:41:32,605 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config Received follower initial sync status for member-3-shard-toaster-config status sync done true 2025-09-06T00:41:32,607 | INFO | Blueprint Extender: 2 | LLDPActivator | 303 - org.opendaylight.openflowplugin.applications.topology-lldp-discovery - 0.20.0 | LLDPDiscoveryListener started. 
2025-09-06T00:41:32,609 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.0 has been started 2025-09-06T00:41:32,609 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.0 | Blueprint container for bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery_0.20.0 [303] was successfully created 2025-09-06T00:41:32,609 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.0 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), Initial app config DatastoreConfig, (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth)] 2025-09-06T00:41:32,611 | INFO | opendaylight-cluster-data-notification-dispatcher-52 | H2Store | 167 - org.opendaylight.aaa.idm-store-h2 - 0.21.0 | H2 IDMStore activated 2025-09-06T00:41:32,623 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config Received follower initial sync status for member-3-shard-topology-config status sync done true 2025-09-06T00:41:32,623 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational Received follower initial sync status for member-3-shard-toaster-operational status sync done true 2025-09-06T00:41:32,633 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.0 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config DatastoreConfig, (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth)] 2025-09-06T00:41:32,634 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.0 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config DatastoreConfig] 2025-09-06T00:41:32,634 | INFO | Blueprint Extender: 1 | NodeConnectorInventoryEventTranslator | 300 - org.opendaylight.openflowplugin.applications.lldp-speaker - 0.20.0 | NodeConnectorInventoryEventTranslator has started. 
2025-09-06T00:41:32,636 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.0 has been started 2025-09-06T00:41:32,636 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational Received follower initial sync status for member-3-shard-inventory-operational status sync done true 2025-09-06T00:41:32,636 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.0 | Blueprint container for bundle org.opendaylight.openflowplugin.applications.lldp-speaker_0.20.0 [300] was successfully created 2025-09-06T00:41:32,646 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.0 is waiting for dependencies [Initial app config ShiroConfiguration, (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config DatastoreConfig] 2025-09-06T00:41:32,652 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.0 is waiting for dependencies [Initial app config ShiroConfiguration, (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager)] 2025-09-06T00:41:32,673 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational Received follower initial sync status for member-3-shard-topology-operational status sync done true 2025-09-06T00:41:32,673 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config Received follower initial sync status for member-3-shard-inventory-config status sync done true 2025-09-06T00:41:32,685 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.0 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager)] 2025-09-06T00:41:32,690 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | EOSClusterSingletonServiceProvider | 257 - org.opendaylight.mdsal.mdsal-singleton-impl - 14.0.13 | Cluster Singleton Service started 2025-09-06T00:41:32,695 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | YangLibraryWriterSingleton | 291 - org.opendaylight.netconf.yanglib-mdsal-writer - 9.0.0 | ietf-yang-library writer registered 2025-09-06T00:41:32,722 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | FlowCapableTopologyProvider | 304 - org.opendaylight.openflowplugin.applications.topology-manager - 0.20.0 | Topology Manager service started. 2025-09-06T00:41:32,745 | INFO | Blueprint Extender: 3 | ForwardingRulesManagerImpl | 299 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.20.0 | ForwardingRulesManager has started successfully. 
2025-09-06T00:41:32,747 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.0 has been started 2025-09-06T00:41:32,748 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.0 | Blueprint container for bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager_0.20.0 [299] was successfully created 2025-09-06T00:41:32,791 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | OSGiSwitchConnectionProviders | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.0 | MD-SAL configuration-based SwitchConnectionProviders started 2025-09-06T00:41:32,795 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.opendaylight.openflowplugin.srm-shell/0.20.0 2025-09-06T00:41:32,818 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | OSGiFactorySwitchConnectionConfiguration | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.0 | Checking presence of configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-default-impl}] 2025-09-06T00:41:32,825 | INFO | opendaylight-cluster-data-notification-dispatcher-50 | OSGiSwitchConnectionProviders | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.0 | Starting instance of type 'openflow-switch-connection-provider-default-impl' 2025-09-06T00:41:32,829 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | OSGiFactorySwitchConnectionConfiguration | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.0 | Configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-default-impl}] already present 2025-09-06T00:41:32,835 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | OSGiFactorySwitchConnectionConfiguration | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.0 | Checking presence of configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-legacy-impl}] 2025-09-06T00:41:32,838 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | OSGiFactorySwitchConnectionConfiguration | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.0 | Configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-legacy-impl}] already present 2025-09-06T00:41:32,895 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | OSGiClusterAdmin | 193 - 
org.opendaylight.controller.sal-cluster-admin-impl - 11.0.0 | Cluster Admin services started 2025-09-06T00:41:32,896 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | OSGiDistributedDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Distributed Datastore type CONFIGURATION started 2025-09-06T00:41:32,935 | INFO | Blueprint Extender: 1 | LazyBindingList | 325 - org.opendaylight.yangtools.binding-data-codec-dynamic - 14.0.14 | Using lazy population for lists larger than 16 element(s) 2025-09-06T00:41:32,981 | INFO | opendaylight-cluster-data-notification-dispatcher-52 | AAAEncryptionServiceImpl | 165 - org.opendaylight.aaa.encrypt-service-impl - 0.21.0 | AAAEncryptionService activated 2025-09-06T00:41:32,982 | INFO | opendaylight-cluster-data-notification-dispatcher-52 | OSGiEncryptionServiceConfigurator | 165 - org.opendaylight.aaa.encrypt-service-impl - 0.21.0 | Encryption Service enabled 2025-09-06T00:41:32,992 | INFO | Blueprint Extender: 1 | AaaCertMdsalProvider | 163 - org.opendaylight.aaa.cert - 0.21.0 | AaaCertMdsalProvider Initialized 2025-09-06T00:41:33,044 | INFO | Blueprint Extender: 1 | CertificateManagerService | 163 - org.opendaylight.aaa.cert - 0.21.0 | Certificate Manager service has been initialized 2025-09-06T00:41:33,047 | INFO | Blueprint Extender: 1 | CertificateManagerService | 163 - org.opendaylight.aaa.cert - 0.21.0 | AaaCert Rpc Service has been initialized 2025-09-06T00:41:33,063 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.0 has been started 2025-09-06T00:41:33,064 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.0 | Blueprint container for bundle org.opendaylight.aaa.cert_0.21.0 [163] was successfully created 2025-09-06T00:41:33,066 | INFO | Blueprint Extender: 3 | StoreBuilder | 162 - org.opendaylight.aaa.authn-api - 0.21.0 | Checking if default entries must be created in IDM store 2025-09-06T00:41:33,208 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-inventory-operational#-574004840], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent} 2025-09-06T00:41:33,209 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=0}, cookie=1} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-inventory-operational#-574004840], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} 2025-09-06T00:41:33,210 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: 
replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=0}, cookie=1} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-inventory-operational#-574004840], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} in 1.266 ms 2025-09-06T00:41:33,248 | INFO | opendaylight-cluster-data-notification-dispatcher-50 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | OpenFlowPluginProvider started, waiting for onSystemBootReady() 2025-09-06T00:41:33,249 | INFO | opendaylight-cluster-data-notification-dispatcher-50 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Added connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@7fe4ed44 2025-09-06T00:41:33,256 | INFO | opendaylight-cluster-data-notification-dispatcher-50 | OnfExtensionProvider | 308 - org.opendaylight.openflowplugin.extension-onf - 0.20.0 | ONF Extension Provider started. 2025-09-06T00:41:33,257 | INFO | opendaylight-cluster-data-notification-dispatcher-53 | OSGiSwitchConnectionProviders | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.0 | Starting instance of type 'openflow-switch-connection-provider-legacy-impl' 2025-09-06T00:41:33,259 | INFO | opendaylight-cluster-data-notification-dispatcher-53 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Added connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@306b6b64 2025-09-06T00:41:33,305 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-48 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-09-06T00:41:33,306 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-48 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-09-06T00:41:33,311 | INFO | Blueprint Extender: 3 | AbstractStore | 167 - org.opendaylight.aaa.idm-store-h2 - 0.21.0 | Table AAA_DOMAINS does not exist, creating it 2025-09-06T00:41:33,452 | INFO | Blueprint Extender: 3 | StoreBuilder | 162 - org.opendaylight.aaa.authn-api - 0.21.0 | Created default domain 2025-09-06T00:41:33,458 | INFO | Blueprint Extender: 3 | AbstractStore | 167 - org.opendaylight.aaa.idm-store-h2 - 0.21.0 | Table AAA_ROLES does not exist, creating it 2025-09-06T00:41:33,504 | INFO | Blueprint Extender: 3 | StoreBuilder | 162 - org.opendaylight.aaa.authn-api - 0.21.0 | Created 'admin' role 2025-09-06T00:41:33,515 | INFO | Blueprint Extender: 3 | StoreBuilder | 162 - org.opendaylight.aaa.authn-api - 0.21.0 | Created 'user' role 2025-09-06T00:41:33,625 | INFO | Blueprint Extender: 3 | AbstractStore | 167 - org.opendaylight.aaa.idm-store-h2 - 0.21.0 | Table AAA_USERS does not exist, creating it 2025-09-06T00:41:33,663 | INFO | Blueprint Extender: 3 | AbstractStore | 167 - org.opendaylight.aaa.idm-store-h2 - 0.21.0 | Table AAA_GRANTS does not exist, creating it 2025-09-06T00:41:33,751 | INFO | 
Blueprint Extender: 3 | AAAShiroProvider | 172 - org.opendaylight.aaa.shiro - 0.21.0 | AAAShiroProvider Session Initiated 2025-09-06T00:41:33,856 | INFO | Blueprint Extender: 3 | IniSecurityManagerFactory | 171 - org.opendaylight.aaa.repackaged-shiro - 0.21.0 | Realms have been explicitly set on the SecurityManager instance - auto-setting of realms will not occur. 2025-09-06T00:41:33,886 | INFO | paxweb-config-1-thread-1 | ServerModel | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.30 | Created new ServletContextModel{id=ServletContextModel-11,contextPath='/auth'} 2025-09-06T00:41:33,886 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-9,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=300, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=172, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}}", size=2} 2025-09-06T00:41:33,886 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-11,contextPath='/auth'} 2025-09-06T00:41:33,887 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Adding OsgiContextModel{WB,id=OCM-9,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=300, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=172, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@5734c96c{/auth,null,STOPPED} 2025-09-06T00:41:33,889 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@5734c96c{/auth,null,STOPPED} 2025-09-06T00:41:33,890 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Registering FilterModel{id=FilterModel-12,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]} 2025-09-06T00:41:33,891 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of FilterModel{id=FilterModel-12,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}", size=2} 2025-09-06T00:41:33,891 | INFO | Blueprint Extender: 3 | WhiteboardWebServer | 176 - org.opendaylight.aaa.web.osgi-impl - 0.21.0 | Bundle org.opendaylight.aaa.shiro_0.21.0 [172] registered context path /auth with 4 service(s) 2025-09-06T00:41:33,892 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context /auth 2025-09-06T00:41:33,893 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Starting Jetty context "/auth" with default Osgi Context OsgiContextModel{WB,id=OCM-9,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=300, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=172, service.scope=singleton, 
osgi.http.whiteboard.context.path=/auth}} 2025-09-06T00:41:33,896 | INFO | paxweb-config-1-thread-1 | CustomFilterAdapter | 166 - org.opendaylight.aaa.filterchain - 0.21.0 | Initializing CustomFilterAdapter 2025-09-06T00:41:33,897 | INFO | paxweb-config-1-thread-1 | CustomFilterAdapter | 166 - org.opendaylight.aaa.filterchain - 0.21.0 | Injecting a new filter chain with 0 Filters: 2025-09-06T00:41:33,897 | INFO | paxweb-config-1-thread-1 | ContextHandler | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@5734c96c{/auth,null,AVAILABLE} 2025-09-06T00:41:33,897 | INFO | paxweb-config-1-thread-1 | OsgiServletContext | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.30 | Registering OsgiServletContext{model=OsgiContextModel{WB,id=OCM-9,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=300, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=172, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}}} as OSGi service for "/auth" context path 2025-09-06T00:41:33,898 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context / 2025-09-06T00:41:33,900 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Registering FilterModel{id=FilterModel-13,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*, /moon/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]} 2025-09-06T00:41:33,900 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of FilterModel{id=FilterModel-13,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*, /moon/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}", size=2} 2025-09-06T00:41:33,900 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context /auth 2025-09-06T00:41:33,901 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context / 2025-09-06T00:41:33,901 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Registering ServletModel{id=ServletModel-14,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]} 2025-09-06T00:41:33,901 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of ServletModel{id=ServletModel-14,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}", size=1} 2025-09-06T00:41:33,901 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Adding servlet ServletModel{id=ServletModel-14,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]} 2025-09-06T00:41:33,904 | ERROR | Blueprint Extender: 3 | MdsalRestconfServer | 279 - org.opendaylight.netconf.restconf-server-mdsal - 9.0.0 | bundle org.opendaylight.netconf.restconf-server-mdsal:9.0.0 (279)[org.opendaylight.restconf.server.mdsal.MdsalRestconfServer(118)] : Constructor argument 5 in class class org.opendaylight.restconf.server.mdsal.MdsalRestconfServer 
has unsupported type [Lorg.opendaylight.restconf.server.spi.RpcImplementation; 2025-09-06T00:41:33,972 | INFO | Blueprint Extender: 3 | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Binding HTTP Service for bundle: [org.opendaylight.netconf.restconf-server-jaxrs_9.0.0 [278]] 2025-09-06T00:41:33,973 | INFO | paxweb-config-1-thread-1 | ServerModel | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.30 | Created new ServletContextModel{id=ServletContextModel-18,contextPath='/rests'} 2025-09-06T00:41:33,973 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-16,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=312, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}}", size=2} 2025-09-06T00:41:33,973 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-18,contextPath='/rests'} 2025-09-06T00:41:33,974 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Adding OsgiContextModel{WB,id=OCM-16,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=312, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@1f60546{/rests,null,STOPPED} 2025-09-06T00:41:33,975 | INFO | Blueprint Extender: 3 | WhiteboardWebServer | 176 - org.opendaylight.aaa.web.osgi-impl - 0.21.0 | Bundle org.opendaylight.netconf.restconf-server-jaxrs_9.0.0 [278] registered context path /rests with 4 service(s) 2025-09-06T00:41:33,975 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@1f60546{/rests,null,STOPPED} 2025-09-06T00:41:33,976 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Registering FilterModel{id=FilterModel-19,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]} 2025-09-06T00:41:33,976 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of FilterModel{id=FilterModel-19,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}", size=2} 2025-09-06T00:41:33,976 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context /auth 2025-09-06T00:41:33,976 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context /rests 2025-09-06T00:41:33,976 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Starting Jetty context "/rests" with default Osgi Context 
OsgiContextModel{WB,id=OCM-16,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=312, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}} 2025-09-06T00:41:33,977 | INFO | paxweb-config-1-thread-1 | CustomFilterAdapter | 166 - org.opendaylight.aaa.filterchain - 0.21.0 | Initializing CustomFilterAdapter 2025-09-06T00:41:33,977 | INFO | paxweb-config-1-thread-1 | CustomFilterAdapter | 166 - org.opendaylight.aaa.filterchain - 0.21.0 | Injecting a new filter chain with 0 Filters: 2025-09-06T00:41:33,977 | INFO | Blueprint Extender: 3 | WhiteboardWebServer | 176 - org.opendaylight.aaa.web.osgi-impl - 0.21.0 | Bundle org.opendaylight.netconf.restconf-server-jaxrs_9.0.0 [278] registered context path /.well-known with 3 service(s) 2025-09-06T00:41:33,977 | INFO | paxweb-config-1-thread-1 | ContextHandler | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@1f60546{/rests,null,AVAILABLE} 2025-09-06T00:41:33,977 | INFO | paxweb-config-1-thread-1 | OsgiServletContext | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.30 | Registering OsgiServletContext{model=OsgiContextModel{WB,id=OCM-16,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=312, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}}} as OSGi service for "/rests" context path 2025-09-06T00:41:33,978 | INFO | Blueprint Extender: 3 | YangLibraryWriterSingleton | 291 - org.opendaylight.netconf.yanglib-mdsal-writer - 9.0.0 | Binding URL provider org.opendaylight.restconf.server.jaxrs.JaxRsYangLibrary@1eb3780b 2025-09-06T00:41:33,978 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context / 2025-09-06T00:41:33,978 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Registering FilterModel{id=FilterModel-20,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]} 2025-09-06T00:41:33,978 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of FilterModel{id=FilterModel-20,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}", size=2} 2025-09-06T00:41:33,978 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context /auth 2025-09-06T00:41:33,979 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context /rests 2025-09-06T00:41:33,979 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context / 2025-09-06T00:41:33,979 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Registering ServletModel{id=ServletModel-21,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]} 2025-09-06T00:41:33,979 | INFO | paxweb-config-1-thread-1 | 
JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of ServletModel{id=ServletModel-21,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}", size=1} 2025-09-06T00:41:33,979 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Adding servlet ServletModel{id=ServletModel-21,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]} 2025-09-06T00:41:33,979 | INFO | paxweb-config-1-thread-1 | ServerModel | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.30 | Created new ServletContextModel{id=ServletContextModel-27,contextPath='/.well-known'} 2025-09-06T00:41:33,980 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-22,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=316, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}}", size=2} 2025-09-06T00:41:33,980 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-27,contextPath='/.well-known'} 2025-09-06T00:41:33,980 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Adding OsgiContextModel{WB,id=OCM-22,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=316, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@20ce6c27{/.well-known,null,STOPPED} 2025-09-06T00:41:33,981 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@20ce6c27{/.well-known,null,STOPPED} 2025-09-06T00:41:33,981 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Registering FilterModel{id=FilterModel-24,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-22,WellKnownURIs,/.well-known}]} 2025-09-06T00:41:33,981 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of FilterModel{id=FilterModel-24,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-22,WellKnownURIs,/.well-known}]}", size=2} 2025-09-06T00:41:33,981 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context /auth 2025-09-06T00:41:33,981 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context /rests 2025-09-06T00:41:33,982 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context /.well-known 2025-09-06T00:41:33,982 | INFO | paxweb-config-1-thread-1 | 
JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Starting Jetty context "/.well-known" with default Osgi Context OsgiContextModel{WB,id=OCM-22,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=316, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}} 2025-09-06T00:41:33,982 | INFO | paxweb-config-1-thread-1 | ContextHandler | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@20ce6c27{/.well-known,null,AVAILABLE} 2025-09-06T00:41:33,982 | INFO | paxweb-config-1-thread-1 | OsgiServletContext | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.30 | Registering OsgiServletContext{model=OsgiContextModel{WB,id=OCM-22,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=316, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}}} as OSGi service for "/.well-known" context path 2025-09-06T00:41:33,983 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context / 2025-09-06T00:41:33,983 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Registering ServletModel{id=ServletModel-26,name='Rootfound',urlPatterns=[/*],contexts=[{WB,OCM-22,WellKnownURIs,/.well-known}]} 2025-09-06T00:41:33,983 | INFO | paxweb-config-1-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of ServletModel{id=ServletModel-26,name='Rootfound',urlPatterns=[/*],contexts=[{WB,OCM-22,WellKnownURIs,/.well-known}]}", size=1} 2025-09-06T00:41:33,983 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Adding servlet ServletModel{id=ServletModel-26,name='Rootfound',urlPatterns=[/*],contexts=[{WB,OCM-22,WellKnownURIs,/.well-known}]} 2025-09-06T00:41:34,002 | INFO | Blueprint Extender: 3 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.14 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv4AddressNoZone 2025-09-06T00:41:34,003 | INFO | Blueprint Extender: 3 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.14 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv4Prefix 2025-09-06T00:41:34,003 | INFO | Blueprint Extender: 3 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.14 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv6AddressNoZone 2025-09-06T00:41:34,004 | INFO | Blueprint Extender: 3 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.14 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv6Prefix 2025-09-06T00:41:34,027 | INFO | Blueprint Extender: 3 | RestconfTransportChannelListener | 276 - org.opendaylight.netconf.restconf-server - 9.0.0 | Initialized with service class 
org.opendaylight.restconf.server.mdsal.MdsalRestconfServer 2025-09-06T00:41:34,027 | INFO | Blueprint Extender: 3 | RestconfTransportChannelListener | 276 - org.opendaylight.netconf.restconf-server - 9.0.0 | Initialized with base path: /restconf, default encoding: JSON, default pretty print: false 2025-09-06T00:41:34,080 | INFO | Blueprint Extender: 3 | OSGiNorthbound | 275 - org.opendaylight.netconf.restconf-nb - 9.0.0 | Global RESTCONF northbound pools started 2025-09-06T00:41:34,081 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.0 has been started 2025-09-06T00:41:34,081 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.0 | Blueprint container for bundle org.opendaylight.aaa.shiro_0.21.0 [172] was successfully created 2025-09-06T00:41:34,896 | INFO | SystemReadyService-0 | KarafSystemReady | 202 - org.opendaylight.infrautils.ready-impl - 7.1.4 | checkBundleDiagInfos: Elapsed time 24s, remaining time 275s, diag: Active {INSTALLED=0, RESOLVED=10, UNKNOWN=0, GRACE_PERIOD=0, WAITING=0, STARTING=0, ACTIVE=399, STOPPING=0, FAILURE=0} 2025-09-06T00:41:34,896 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 2025-09-06T00:41:34,896 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.4 | Now notifying all its registered SystemReadyListeners... 2025-09-06T00:41:34,896 | INFO | SystemReadyService-0 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | onSystemBootReady() received, starting the switch connections 2025-09-06T00:41:34,992 | INFO | epollEventLoopGroup-2-1 | TcpServerFacade | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.0 | Switch listener started and ready to accept incoming TCP/TLS connections on /[0:0:0:0:0:0:0:0]:6653 2025-09-06T00:41:34,996 | INFO | epollEventLoopGroup-2-1 | SwitchConnectionProviderImpl | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.0 | Started TCP connection on /[0:0:0:0:0:0:0:0]:6653 2025-09-06T00:41:34,996 | INFO | epollEventLoopGroup-2-1 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@7fe4ed44 started 2025-09-06T00:41:34,997 | INFO | epollEventLoopGroup-4-1 | TcpServerFacade | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.0 | Switch listener started and ready to accept incoming TCP/TLS connections on /[0:0:0:0:0:0:0:0]:6633 2025-09-06T00:41:34,997 | INFO | epollEventLoopGroup-4-1 | SwitchConnectionProviderImpl | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.0 | Started TCP connection on /[0:0:0:0:0:0:0:0]:6633 2025-09-06T00:41:34,997 | INFO | epollEventLoopGroup-4-1 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@306b6b64 started 2025-09-06T00:41:34,997 | INFO | epollEventLoopGroup-4-1 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | All switchConnectionProviders are up and running (2). 
2025-09-06T00:44:21,767 | INFO | sshd-SshServer[4c5affee](port=8101)-nio2-thread-1 | OpenSSHKeyPairProvider | 121 - org.apache.karaf.shell.ssh - 4.4.7 | Creating ssh server private key at /tmp/karaf-0.23.0/etc/host.key 2025-09-06T00:44:21,770 | INFO | sshd-SshServer[4c5affee](port=8101)-nio2-thread-1 | OpenSSHKeyPairGenerator | 121 - org.apache.karaf.shell.ssh - 4.4.7 | generateKeyPair(RSA) generating host key - size=2048 2025-09-06T00:44:22,312 | INFO | sshd-SshServer[4c5affee](port=8101)-nio2-thread-2 | ServerSessionImpl | 125 - org.apache.sshd.osgi - 2.14.0 | Session karaf@/10.30.170.65:36644 authenticated 2025-09-06T00:44:23,741 | INFO | pipe-log:log "ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-titanium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/010__Cluster_Reconcilliation_Multi_DPN.robot" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-titanium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/010__Cluster_Reconcilliation_Multi_DPN.robot 2025-09-06T00:44:24,384 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Check Shards Status And Initialize Variables" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Check Shards Status And Initialize Variables 2025-09-06T00:44:27,639 | INFO | qtp674804643-477 | JaxRsRestconf | 278 - org.opendaylight.netconf.restconf-server-jaxrs - 9.0.0 | RESTCONF data-missing condition is reported as HTTP status 409 (RFC8040) 2025-09-06T00:44:27,642 | INFO | qtp674804643-477 | JaxRsRestconf | 278 - org.opendaylight.netconf.restconf-server-jaxrs - 9.0.0 | RESTCONF data-missing condition is reported as HTTP status 409 (RFC8040) 2025-09-06T00:44:28,486 | INFO | qtp674804643-477 | AuthenticationManager | 174 - org.opendaylight.aaa.tokenauthrealm - 0.21.0 | Authentication is now enabled 2025-09-06T00:44:28,486 | INFO | qtp674804643-477 | AuthenticationManager | 174 - org.opendaylight.aaa.tokenauthrealm - 0.21.0 | Authentication Manager activated 2025-09-06T00:44:28,532 | INFO | qtp674804643-477 | ApiPathParser | 273 - org.opendaylight.netconf.restconf-api - 9.0.0 | Consecutive slashes in REST URLs will be rejected 2025-09-06T00:44:32,764 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Get Inventory Follower and Leader Before Cluster Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Get Inventory Follower and Leader Before Cluster Restart 2025-09-06T00:44:33,860 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Connect To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Connect To Follower Node1 2025-09-06T00:44:37,591 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Add Bulk Flow From Follower" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation 
Multi DPN.Add Bulk Flow From Follower
2025-09-06T00:44:37,936 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false]
2025-09-06T00:44:37,949 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-48 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.0 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true]
2025-09-06T00:44:38,149 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-48 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true]
2025-09-06T00:44:38,386 | INFO | opendaylight-cluster-data-notification-dispatcher-51 | ConnectionManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Clearing the device connection timer for the device 1
2025-09-06T00:44:38,457 | WARN | node-cleaner-0 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Seems like device is still owned by other controller instance. Skip deleting openflow:1 node from operational datastore.
2025-09-06T00:44:39,166 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.lang.UnsupportedOperationException: null
at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 2025-09-06T00:44:39,173 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | null 2025-09-06T00:46:18,852 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Get Bulk Flows and Verify In Inventory Leader" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Get Bulk Flows and Verify In Inventory Leader 2025-09-06T00:46:19,412 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] 
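The stack trace above shows the first failure of this capture: a journal append fails because MappedByteBuf.getBytes throws UnsupportedOperationException while BufThenFileOutputStream is spilling a serialized log entry to disk, and the warnings that follow suggest JournalWriteTask then fails each queued append with a CancellationException ("Previous action failed") whose cause is that original error. The Java sketch below is only a minimal, hypothetical illustration of that fail-then-cancel pattern; ToyBuffer, SpillingOutputStream, ToyJournalWriter and the spill threshold are invented names and values, not the OpenDaylight classes named in the log.

// Hypothetical sketch of the failure pattern visible in the traces above: a buffer view whose
// stream-based getBytes overload is unsupported makes the first append fail, and every queued
// append after it is then failed with a CancellationException carrying the original cause.
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.util.ArrayDeque;
import java.util.Queue;
import java.util.concurrent.CancellationException;

public class ToyJournalWriter {

    /** Minimal buffer view; the OutputStream-based copy overload is intentionally unsupported. */
    static final class ToyBuffer {
        private final byte[] data;
        ToyBuffer(byte[] data) { this.data = data; }
        int readableBytes() { return data.length; }
        /** Stands in for a getBytes(index, OutputStream, length) overload a mapped view may not support. */
        void getBytes(int index, OutputStream out, int length) throws IOException {
            throw new UnsupportedOperationException();
        }
    }

    /** Buffers in memory and "switches to file" once a size threshold is crossed. */
    static final class SpillingOutputStream extends OutputStream {
        private final ByteArrayOutputStream memory = new ByteArrayOutputStream();
        private final ToyBuffer backing;
        private final int threshold;
        SpillingOutputStream(ToyBuffer backing, int threshold) {
            this.backing = backing;
            this.threshold = threshold;
        }
        @Override
        public void write(int b) throws IOException {
            memory.write(b);
            if (memory.size() > threshold) {
                // The spill path copies the backing buffer through the unsupported overload.
                backing.getBytes(0, memory, backing.readableBytes());
            }
        }
    }

    public static void main(String[] args) {
        Queue<byte[]> pending = new ArrayDeque<>();
        pending.add(new byte[16]);
        pending.add(new byte[16]);
        pending.add(new byte[16]);

        Throwable firstFailure = null;
        int entry = 0;
        while (!pending.isEmpty()) {
            byte[] payload = pending.poll();
            entry++;
            if (firstFailure != null) {
                // Once one append fails, later appends are cancelled with the original cause,
                // which is the shape of the repeated "Previous action failed" warnings.
                CancellationException ce = new CancellationException("Previous action failed");
                ce.initCause(firstFailure);
                System.out.println("entry " + entry + " cancelled: " + ce);
                continue;
            }
            try (SpillingOutputStream out = new SpillingOutputStream(new ToyBuffer(payload), 4)) {
                for (byte b : payload) {
                    out.write(b); // crosses the threshold and hits the unsupported copy path
                }
                System.out.println("entry " + entry + " appended");
            } catch (UnsupportedOperationException | IOException e) {
                firstFailure = e;
                System.out.println("entry " + entry + " failed: " + e);
            }
        }
    }
}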
2025-09-06T00:46:18,852 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Get Bulk Flows and Verify In Inventory Leader" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Get Bulk Flows and Verify In Inventory Leader
2025-09-06T00:46:19,412 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T00:46:19,415 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-48 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T00:46:19,416 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T00:46:19,416 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T00:46:19,416 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T00:46:19,417 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T00:46:19,418 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T00:46:19,419 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T00:46:19,421 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T00:46:19,421 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T00:46:19,422 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T00:46:19,422 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T00:46:19,422 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T00:46:19,423 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T00:46:20,453 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T00:46:20,454 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T00:46:20,454 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T00:46:20,455 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T00:46:20,456 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T00:46:20,456 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T00:46:20,457 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T00:46:20,458 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T00:46:20,458 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T00:46:20,459 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T00:46:20,459 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T00:46:20,460 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T00:46:20,461 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T00:46:20,461 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T00:46:20,977 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T00:46:20,978 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T00:46:20,979 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T00:46:20,979 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T00:46:21,491 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T00:46:21,492 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T00:46:21,493 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T00:46:21,494 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T00:46:22,013 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T00:46:22,014 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T00:46:22,014 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T00:46:22,015 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T00:46:22,532 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T00:46:22,533 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T00:46:22,533 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T00:46:22,533 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T00:46:22,534 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T00:46:22,534 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T00:46:23,051 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T00:46:23,052 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T00:46:23,053 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T00:46:23,054 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T00:46:23,576 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T00:46:23,577 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T00:46:23,577 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T00:46:23,577 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T00:46:24,089 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
... 2 more
2025-09-06T00:46:24,090 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T00:48:00,319 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Before Cluster Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Before Cluster Restart
2025-09-06T00:49:42,486 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Follower Node1 and Exit" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Follower Node1 and Exit
2025-09-06T00:49:42,700 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false]
2025-09-06T00:49:42,700 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.0 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false]
2025-09-06T00:49:43,206 | INFO | node-cleaner-0 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Try to remove device openflow:1 from operational DS
2025-09-06T00:49:45,150 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Reconnect To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Reconnect To Follower Node1
2025-09-06T00:49:47,669 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-48 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.0 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true]
2025-09-06T00:49:47,708 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-48 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true]
2025-09-06T00:49:47,942 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Reconnected To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Reconnected To Follower Node1
2025-09-06T00:51:30,233 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Follower Node1
2025-09-06T00:51:30,490 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false]
2025-09-06T00:51:30,490 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.0 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false]
2025-09-06T00:51:30,996 | INFO | node-cleaner-1 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Try to remove device openflow:1 from operational DS
2025-09-06T00:51:32,933 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Connect To Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Connect To Follower Node2
2025-09-06T00:51:35,310 | INFO | epollEventLoopGroup-5-1 | SystemNotificationsListenerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | ConnectionEvent: Connection closed by device, Device:/10.30.171.192:33798, NodeId:null
2025-09-06T00:51:35,344 | INFO | epollEventLoopGroup-5-2 | ConnectionAdapterImpl | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.0 | Hello received
2025-09-06T00:51:35,351 | INFO | epollEventLoopGroup-5-2 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Device openflow:1 connected.
2025-09-06T00:51:35,352 | INFO | epollEventLoopGroup-5-2 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | No context chain found for device: openflow:1, creating new.
2025-09-06T00:51:35,352 | INFO | epollEventLoopGroup-5-2 | DeviceManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | ConnectionEvent: Device connected to controller, Device:/10.30.171.192:33810, NodeId:Uri{value=openflow:1}
2025-09-06T00:51:35,373 | INFO | epollEventLoopGroup-5-2 | RoleContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Started timer for setting SLAVE role on device openflow:1 if no role will be set in 20s.
2025-09-06T00:51:35,689 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.0 | Entity ownership change received for node : openflow:1 : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true]
2025-09-06T00:51:35,752 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Connected To Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Connected To Follower Node2
2025-09-06T00:51:35,969 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Entity ownership change received for node : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true]
2025-09-06T00:51:35,970 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Starting DeviceContextImpl[NEW] service for node openflow:1
2025-09-06T00:51:35,979 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Starting RpcContextImpl[NEW] service for node openflow:1
2025-09-06T00:51:35,997 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Starting StatisticsContextImpl[NEW] service for node openflow:1
2025-09-06T00:51:35,997 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Starting RoleContextImpl[NEW] service for node openflow:1
2025-09-06T00:51:35,999 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | SalRoleRpc | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | SetRole called with input:SetRoleInput{controllerRole=BECOMEMASTER, node=NodeRef{value=DataObjectIdentifier[ @ urn.opendaylight.inventory.rev130819.Nodes ... nodes.Node[NodeKey{id=Uri{value=openflow:1}}] ]}}
2025-09-06T00:51:35,999 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | SalRoleRpc | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Requesting state change to BECOMEMASTER
2025-09-06T00:51:35,999 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | SalRoleRpc | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | RoleChangeTask called on device:openflow:1 OFPRole:BECOMEMASTER
2025-09-06T00:51:35,999 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | RoleService | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | getGenerationIdFromDevice called for device: openflow:1
2025-09-06T00:51:36,004 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ContextChainImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Started clustering services for node openflow:1
2025-09-06T00:51:36,004 | INFO | epollEventLoopGroup-5-2 | RoleService | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | submitRoleChange called for device:Uri{value=openflow:1}, role:BECOMEMASTER
2025-09-06T00:51:36,006 | INFO | epollEventLoopGroup-5-2 | RoleService | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | submitRoleChange onSuccess for device:Uri{value=openflow:1}, role:BECOMEMASTER
2025-09-06T00:51:36,013 | INFO | ofppool-0 | FlowNodeReconciliationImpl | 299 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.20.0 | Triggering reconciliation for device NodeKey{id=Uri{value=openflow:1}}
2025-09-06T00:51:36,023 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:51:37,043 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] 
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:51:38,063 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:51:39,082 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:51:40,102 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T00:51:41,123 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:51:42,142 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:51:43,165 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:51:44,183 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:51:45,203 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:51:46,222 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:51:47,242 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:51:48,262 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:51:49,282 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:51:50,303 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:51:51,322 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:51:52,343 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:51:53,362 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:51:54,382 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:51:55,402 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:51:56,423 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:51:57,442 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:51:58,463 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:51:59,482 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T00:52:00,502 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:01,523 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:02,542 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:03,563 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:04,582 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:05,603 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:06,622 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:07,642 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:08,662 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:52:09,683 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:52:10,704 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:52:11,722 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:52:12,742 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:52:13,764 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:52:14,795 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:52:15,813 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:52:16,834 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:52:17,853 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:52:18,875 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:52:19,892 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:52:20,912 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:52:21,932 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:52:22,953 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:52:23,974 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:24,993 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:26,012 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T00:52:27,033 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:28,054 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:29,073 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:30,093 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:31,113 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:32,133 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:33,153 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:34,172 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:35,192 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:36,213 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T00:52:37,233 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T00:52:38,252 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T00:52:39,274 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T00:52:40,292 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T00:52:41,312 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T00:52:42,332 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T00:52:43,352 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T00:52:44,373 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T00:52:45,393 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T00:52:46,413 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T00:52:47,433 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T00:52:48,453 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T00:52:49,474 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T00:52:50,492 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T00:52:51,512 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T00:52:52,532 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T00:52:53,552 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:54,573 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:55,593 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:56,613 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:57,633 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:58,653 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:52:59,673 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:00,692 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:01,712 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:02,732 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:03,754 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:04,774 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:05,792 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T00:53:06,813 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:07,833 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:08,852 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:09,872 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:10,892 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:11,912 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:12,932 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:13,953 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:14,974 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:15,993 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:17,012 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:18,033 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more
2025-09-06T00:53:18,100 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Follower Node2
2025-09-06T00:53:18,131 | INFO | epollEventLoopGroup-5-2 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.14 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.MacAddress
2025-09-06T00:53:18,134 | INFO | epollEventLoopGroup-5-2 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.14 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.PhysAddress
2025-09-06T00:53:18,134 | INFO | epollEventLoopGroup-5-2 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.14 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.HexString
2025-09-06T00:53:18,135 | INFO | epollEventLoopGroup-5-2 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.14 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.DottedQuad
2025-09-06T00:53:18,135 | INFO | epollEventLoopGroup-5-2 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.14 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.Uuid
2025-09-06T00:53:18,230 | INFO | epollEventLoopGroup-5-2 | SystemNotificationsListenerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | ConnectionEvent: Connection closed by device, Device:/10.30.171.192:33810, NodeId:openflow:1
2025-09-06T00:53:18,231 | INFO | epollEventLoopGroup-5-2 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Device openflow:1 disconnected.
2025-09-06T00:53:18,231 | INFO | epollEventLoopGroup-5-2 | ReconciliationManagerImpl | 302 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.20.0 | Stopping reconciliation for node Uri{value=openflow:1}
2025-09-06T00:53:18,231 | WARN | epollEventLoopGroup-5-2 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Reconciliation framework failure for device openflow:1
java.util.concurrent.CancellationException: Task was cancelled.
    at com.google.common.util.concurrent.AbstractFuture.cancellationExceptionWithCause(AbstractFuture.java:1021) ~[?:?]
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:288) ~[?:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:235) ~[?:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[?:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[?:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[?:?]
    at com.google.common.util.concurrent.Uninterruptibles.getUninterruptibly(Uninterruptibles.java:246) ~[?:?]
    at com.google.common.util.concurrent.Futures.getDone(Futures.java:1175) ~[?:?]
    at com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1123) ~[?:?]
    at com.google.common.util.concurrent.DirectExecutor.execute(DirectExecutor.java:30) ~[?:?]
    at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:1004) ~[?:?]
    at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:767) ~[?:?]
    at com.google.common.util.concurrent.AbstractFuture.cancel(AbstractFuture.java:372) ~[?:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.cancel(FluentFuture.java:120) ~[?:?]
    at org.opendaylight.openflowplugin.applications.reconciliation.impl.ReconciliationManagerImpl.cancelNodeReconciliation(ReconciliationManagerImpl.java:138) ~[?:?]
    at org.opendaylight.openflowplugin.applications.reconciliation.impl.ReconciliationManagerImpl.onDeviceDisconnected(ReconciliationManagerImpl.java:115) ~[?:?]
    at org.opendaylight.openflowplugin.impl.mastership.MastershipChangeServiceManagerImpl.becomeSlaveOrDisconnect(MastershipChangeServiceManagerImpl.java:101) ~[?:?]
    at org.opendaylight.openflowplugin.impl.lifecycle.ContextChainHolderImpl.destroyContextChain(ContextChainHolderImpl.java:363) ~[?:?]
    at org.opendaylight.openflowplugin.impl.lifecycle.ContextChainHolderImpl.onDeviceDisconnected(ContextChainHolderImpl.java:273) ~[?:?]
    at org.opendaylight.openflowplugin.impl.connection.ConnectionContextImpl.propagateDeviceDisconnectedEvent(ConnectionContextImpl.java:179) ~[?:?]
    at org.opendaylight.openflowplugin.impl.connection.ConnectionContextImpl.disconnectDevice(ConnectionContextImpl.java:168) ~[?:?]
    at org.opendaylight.openflowplugin.impl.connection.ConnectionContextImpl.onConnectionClosed(ConnectionContextImpl.java:126) ~[?:?]
    at org.opendaylight.openflowplugin.impl.connection.listener.SystemNotificationsListenerImpl.onDisconnect(SystemNotificationsListenerImpl.java:86) ~[?:?]
    at org.opendaylight.openflowjava.protocol.impl.core.connection.ConnectionAdapterImpl.consumeDeviceMessage(ConnectionAdapterImpl.java:121) ~[?:?]
    at org.opendaylight.openflowjava.protocol.impl.core.connection.AbstractConnectionAdapterStatistics.consume(AbstractConnectionAdapterStatistics.java:68) ~[?:?]
    at org.opendaylight.openflowjava.protocol.impl.core.connection.ConnectionAdapterImpl.consume(ConnectionAdapterImpl.java:62) ~[?:?]
    at org.opendaylight.openflowjava.protocol.impl.core.DelegatingInboundHandler.channelInactive(DelegatingInboundHandler.java:53) ~[?:?]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:251) ~[?:?]
    at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:81) ~[?:?]
    at org.opendaylight.openflowjava.protocol.impl.core.connection.AbstractOutboundQueueManager.channelInactive(AbstractOutboundQueueManager.java:169) ~[?:?]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:251) ~[?:?]
    at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:81) ~[?:?]
    at io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:284) ~[?:?]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:251) ~[?:?]
    at io.netty.handler.codec.ByteToMessageDecoder.channelInputClosed(ByteToMessageDecoder.java:412) ~[?:?]
    at io.netty.handler.codec.ByteToMessageDecoder.channelInactive(ByteToMessageDecoder.java:377) ~[?:?]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:251) ~[?:?]
    at io.netty.handler.codec.ByteToMessageDecoder.channelInputClosed(ByteToMessageDecoder.java:412) ~[?:?]
    at io.netty.handler.codec.ByteToMessageDecoder.channelInactive(ByteToMessageDecoder.java:377) ~[?:?]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:251) ~[?:?]
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1424) ~[?:?]
    at io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:876) ~[?:?]
    at io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:684) ~[?:?]
    at io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:148) ~[?:?]
    at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:141) ~[?:?]
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:507) ~[?:?]
    at io.netty.channel.SingleThreadIoEventLoop.run(SingleThreadIoEventLoop.java:182) ~[?:?]
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:1073) ~[?:?]
    at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) ~[?:?]
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) ~[?:?]
    at java.lang.Thread.run(Thread.java:1583) [?:?]
2025-09-06T00:53:18,234 | INFO | epollEventLoopGroup-5-2 | ReconciliationManagerImpl | 302 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.20.0 | Stopping reconciliation for node Uri{value=openflow:1}
2025-09-06T00:53:18,236 | WARN | pool-22-thread-1 | FlowNodeReconciliationImpl | 299 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.20.0 | Fail with read Config/DS for Node DataObjectIdentifier[ @ urn.opendaylight.inventory.rev130819.Nodes ... nodes.Node[NodeKey{id=Uri{value=openflow:1}}] @ urn.opendaylight.flow.inventory.rev130819.FlowCapableNode ] !
java.lang.InterruptedException: null
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:249) ~[?:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[?:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[?:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[?:?]
    at org.opendaylight.openflowplugin.applications.frm.impl.FlowNodeReconciliationImpl$ReconciliationTask.call(FlowNodeReconciliationImpl.java:354) ~[?:?]
    at org.opendaylight.openflowplugin.applications.frm.impl.FlowNodeReconciliationImpl$ReconciliationTask.call(FlowNodeReconciliationImpl.java:336) ~[?:?]
    at com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:128) ~[?:?]
    at com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:74) ~[?:?]
    at com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:80) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    at java.lang.Thread.run(Thread.java:1583) [?:?]
2025-09-06T00:53:18,239 | INFO | epollEventLoopGroup-5-2 | ReconciliationManagerImpl | 302 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.20.0 | Stopping reconciliation for node Uri{value=openflow:1}
2025-09-06T00:53:18,239 | INFO | epollEventLoopGroup-5-2 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Role SLAVE was granted to device openflow:1
2025-09-06T00:53:18,239 | INFO | epollEventLoopGroup-5-2 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Stopping RoleContextImpl[RUNNING] service for node openflow:1
2025-09-06T00:53:18,242 | INFO | epollEventLoopGroup-5-2 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Stopping StatisticsContextImpl[RUNNING] service for node openflow:1
2025-09-06T00:53:18,242 | INFO | epollEventLoopGroup-5-2 | StatisticsContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Stopping running statistics gathering for node openflow:1
2025-09-06T00:53:18,244 | INFO | epollEventLoopGroup-5-2 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Stopping RpcContextImpl[RUNNING] service for node openflow:1
2025-09-06T00:53:18,244 | INFO | epollEventLoopGroup-5-2 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Stopping DeviceContextImpl[RUNNING] service for node openflow:1
2025-09-06T00:53:18,247 | INFO | epollEventLoopGroup-5-2 | ContextChainImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Closed clustering services registration for node openflow:1
2025-09-06T00:53:18,247 | INFO | ofppool-0 | ContextChainImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Closed clustering services for node openflow:1
2025-09-06T00:53:18,247 | INFO | epollEventLoopGroup-5-2 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Terminating DeviceContextImpl[TERMINATED] service for node openflow:1
2025-09-06T00:53:18,248 | INFO | epollEventLoopGroup-5-2 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Terminating RpcContextImpl[TERMINATED] service for node openflow:1
2025-09-06T00:53:18,248 | INFO | epollEventLoopGroup-5-2 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Terminating StatisticsContextImpl[TERMINATED] service for node openflow:1
2025-09-06T00:53:18,248 | INFO | epollEventLoopGroup-5-2 | StatisticsContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Stopping running statistics gathering for node openflow:1
2025-09-06T00:53:18,248 | INFO | epollEventLoopGroup-5-2 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Terminating RoleContextImpl[TERMINATED] service for node openflow:1
2025-09-06T00:53:18,439 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Entity ownership change received for node : LOCAL_OWNERSHIP_LOST_NO_OWNER [wasOwner=true, isOwner=false, hasOwner=false]
2025-09-06T00:53:18,439 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.0 | Entity ownership change received for node : openflow:1 : LOCAL_OWNERSHIP_LOST_NO_OWNER [wasOwner=true, isOwner=false, hasOwner=false]
2025-09-06T00:53:18,944 | INFO | node-cleaner-0 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Try to remove device
openflow:1 from operational DS 2025-09-06T00:53:19,053 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] 
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:20,073 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T00:53:20,778 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Connect To Inventory Leader" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Connect To Inventory Leader
2025-09-06T00:53:21,097 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:22,123 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:23,142 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T00:53:23,567 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Connected To Leader" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Connected To Leader
2025-09-06T00:53:23,889 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.0 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true]
2025-09-06T00:53:23,970 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true]
2025-09-06T00:53:24,162 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:25,183 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:26,203 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:27,223 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:28,241 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:29,263 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T00:53:30,282 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:53:31,302 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:53:32,323 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:53:33,343 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:53:34,362 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:53:35,382 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:53:36,402 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:53:37,422 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:53:38,442 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:53:39,463 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:53:40,483 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:53:41,502 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:53:42,522 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:53:43,545 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:44,156 | INFO | opendaylight-cluster-data-notification-dispatcher-54 | ConnectionManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Clearing the device connection timer for the device 1 2025-09-06T00:53:44,563 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:45,582 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:46,602 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T00:53:47,623 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:48,642 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:49,664 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:50,682 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:51,702 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:52,722 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:53,742 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:54,763 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:55,784 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:56,802 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:57,822 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:58,842 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:53:59,862 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T00:54:00,883 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:01,903 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:02,922 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:03,943 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:04,962 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:05,982 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:07,001 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:08,023 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:09,042 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:10,062 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T00:54:11,082 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:54:12,103 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:54:13,122 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:54:14,142 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:54:15,163 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:54:16,182 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:54:17,202 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:54:18,223 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:54:19,242 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:54:20,262 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:54:21,282 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:54:22,302 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:54:23,322 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:54:24,342 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:54:25,362 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:54:26,383 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T00:54:27,402 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:28,422 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:29,442 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:30,462 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:31,491 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:32,512 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:33,533 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:34,552 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:35,572 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:36,593 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:37,613 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:38,632 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:39,652 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T00:54:40,673 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:41,692 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:42,713 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:43,733 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:44,753 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:45,773 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:46,793 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:47,813 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:48,832 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:49,852 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:50,873 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:51,892 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:54:52,912 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
	... 5 more
2025-09-06T00:54:53,931 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:54:54,952 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:54:55,972 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:54:56,994 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:54:58,013 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:54:59,033 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:00,052 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:01,072 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:02,092 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:03,111 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:04,133 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:05,151 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:05,848 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Inventory Leader" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Inventory Leader
2025-09-06T00:55:06,172 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:06,239 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false]
2025-09-06T00:55:06,239 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.0 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false]
2025-09-06T00:55:06,745 | INFO | node-cleaner-2 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Try to remove device openflow:1 from operational DS
2025-09-06T00:55:07,192 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:08,213 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:08,533 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Delete All Flows From Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Delete All Flows From Follower Node1
2025-09-06T00:55:09,237 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:10,252 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:11,272 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:12,292 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:13,312 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T00:55:14,332 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:15,353 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:16,373 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:17,392 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:18,411 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:19,431 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:20,452 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:21,472 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:22,492 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:23,512 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:24,533 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:25,552 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:26,572 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T00:55:27,593 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:28,612 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:29,632 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:30,652 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:31,673 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:32,692 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:33,713 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:34,732 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:35,751 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:36,772 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:37,792 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:38,812 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:39,834 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:40,852 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:41,872 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:42,891 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:43,912 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:44,933 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:45,952 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:46,972 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:47,991 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:49,016 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:50,032 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:55:51,054 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:52,073 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:53,092 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T00:55:54,112 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:55,131 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:56,152 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:57,172 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:58,193 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:55:59,211 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:00,232 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:01,252 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:02,272 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:03,292 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:04,313 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:05,332 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:06,352 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T00:56:07,372 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:08,391 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:09,411 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:10,432 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:11,455 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:12,472 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:13,491 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:14,512 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:15,532 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:16,552 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T00:56:17,571 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:18,592 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:19,612 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:20,635 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:21,652 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:22,672 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:23,692 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:24,712 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:25,734 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:26,752 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:27,771 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:28,792 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:29,812 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:30,833 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:31,851 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:32,871 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T00:56:33,892 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:34,912 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:35,932 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:36,951 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:37,972 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:38,994 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:40,012 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:41,036 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:42,051 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:43,072 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:44,092 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:56:45,112 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T00:56:46,132 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:47,151 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:48,172 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:49,192 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:49,737 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify No Flows In Inventory Leader" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify No Flows In Inventory Leader
2025-09-06T00:56:50,212 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:51,232 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:52,252 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:53,272 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:54,292 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:55,311 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:56,333 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:57,351 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:58,372 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:56:59,392 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:57:00,413 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:57:01,432 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:02,453 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:03,472 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T00:57:04,493 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:05,514 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:06,533 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:07,552 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:08,572 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:09,592 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:10,613 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:11,633 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:12,652 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:13,672 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:14,693 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:15,712 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:16,732 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T00:57:17,752 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:18,771 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:19,792 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:20,812 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:21,832 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:22,852 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:23,872 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:24,892 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:25,912 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:26,932 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T00:57:43,255 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	...
5 more 2025-09-06T00:57:44,271 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:45,291 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:46,312 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:47,332 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:48,352 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:49,373 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:50,392 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:51,413 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:52,432 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:53,453 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:57:54,475 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T00:57:55,493 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T00:57:56,512 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T00:57:57,533 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T00:57:58,552 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T00:57:59,574 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T00:58:00,594 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T00:58:01,612 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T00:58:02,635 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T00:58:03,655 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T00:58:04,672 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T00:58:05,693 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T00:58:06,712 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T00:58:07,732 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T00:58:08,752 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T00:58:09,771 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T00:58:10,791 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T00:58:11,812 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:12,831 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:13,853 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:14,871 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:15,893 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:16,912 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:17,931 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:18,952 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:19,973 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:20,992 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:22,012 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:23,033 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T00:58:24,053 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:25,074 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:26,092 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:27,113 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:28,132 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:29,152 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:30,173 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:31,192 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:32,215 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:33,231 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:33,371 | INFO | sshd-SshServer[4c5affee](port=8101)-nio2-thread-2 | ServerSessionImpl | 125 - org.apache.sshd.osgi - 2.14.0 | Session karaf@/10.30.170.65:42714 authenticated 2025-09-06T00:58:33,995 | INFO | pipe-log:log "ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-titanium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/020__Cluster_HA_Data_Recovery_BulkFlow_2Node_Cluster.robot" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-titanium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/020__Cluster_HA_Data_Recovery_BulkFlow_2Node_Cluster.robot 2025-09-06T00:58:34,253 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:34,373 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status and Initialize Variables" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status and Initialize Variables 2025-09-06T00:58:35,272 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:36,292 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:37,312 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:38,333 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:39,352 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T00:58:40,372 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:41,392 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:42,099 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Inventory Follower Before Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Inventory Follower Before Leader Restart 2025-09-06T00:58:42,412 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:43,433 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:44,452 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:45,472 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:46,493 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:47,513 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T00:58:48,533 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:49,552 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:50,572 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:51,592 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:58:52,613 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:58:53,632 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:58:53,958 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Shutdown Leader From Cluster Node" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Shutdown Leader From Cluster Node
2025-09-06T00:58:54,362 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Leader Shutdown" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Leader Shutdown
2025-09-06T00:58:54,652 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:58:55,672 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:58:56,693 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:58:57,713 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:58:58,733 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:58:59,752 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:59:00,772 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:59:01,797 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:59:02,812 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:59:03,832 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:59:04,853 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:59:05,872 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:59:06,892 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:59:07,914 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:59:08,933 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:59:09,952 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:59:10,973 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:59:11,993 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:59:13,011 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T00:59:14,032 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:59:15,052 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:59:16,073 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:59:17,093 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:59:18,112 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:59:19,133 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T00:59:20,153 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
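The entry above, and the repeated warnings that follow, all have the same shape: a connect request to member-2-shard-inventory-config is rejected with NotLeaderException, AbstractShardBackendResolver.wrap converts that failure into a TimeoutException ("Connection attempt failed") inside a CompletableFuture completion callback (see the uniWhenComplete and lambda$connectShard$2 frames), and the resolution is retried roughly once per second. The sketch below reproduces that wrap-and-retry shape in plain Java purely as an illustration; ResolverSketch, connectShard and the fixed one-second delay are assumptions made for this example, not the OpenDaylight implementation referenced in the stack frames.

import java.util.concurrent.CompletableFuture;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

// Illustrative sketch only: a failed backend connect attempt is wrapped into a
// TimeoutException and retried on a fixed delay, producing a once-per-second
// WARN pattern like the one in this log. All names here are hypothetical.
public class ResolverSketch {
    private static final ScheduledExecutorService RETRY_TIMER =
            Executors.newSingleThreadScheduledExecutor();

    // Stand-in for the remote shard actor rejecting the request because it is
    // not the current Raft leader.
    static CompletableFuture<String> connectShard(String shardName) {
        return CompletableFuture.failedFuture(
                new IllegalStateException(shardName + " is not the current leader"));
    }

    static CompletableFuture<String> resolveBackend(String shardName) {
        CompletableFuture<String> result = new CompletableFuture<>();
        attempt(shardName, result);
        return result;
    }

    private static void attempt(String shardName, CompletableFuture<String> result) {
        connectShard(shardName).whenComplete((backend, failure) -> {
            if (failure == null) {
                result.complete(backend);
                return;
            }
            // Wrap the real cause, mirroring the "TimeoutException: Connection
            // attempt failed ... Caused by: NotLeaderException" nesting above.
            TimeoutException wrapped = new TimeoutException("Connection attempt failed");
            wrapped.initCause(failure);
            System.err.println("WARN Failed to resolve shard: " + wrapped
                    + " caused by " + failure);
            // Retry after roughly one second, matching the log's cadence.
            RETRY_TIMER.schedule(() -> attempt(shardName, result), 1, TimeUnit.SECONDS);
        });
    }

    public static void main(String[] args) throws InterruptedException {
        resolveBackend("member-2-shard-inventory-config");
        Thread.sleep(3500);   // observe a few retries, then exit
        RETRY_TIMER.shutdownNow();
    }
}

In the log every attempt fails with the identical cause, which suggests the client keeps reaching a member-2 replica of the inventory-config shard that is not (or no longer) the leader, rather than hitting a transient network timeout; the warnings can be expected to repeat unchanged until a shard leader becomes reachable.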
2025-09-06T00:59:21,172 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:59:22,193 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:59:23,212 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:59:24,233 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:59:25,252 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:59:26,271 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:59:27,292 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:59:28,312 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:59:29,332 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:59:30,352 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:59:31,372 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:59:32,392 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:59:33,413 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:59:34,432 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T00:59:35,453 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:59:36,473 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:59:37,492 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:59:38,512 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:59:39,532 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T00:59:40,552 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:59:41,572 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:59:42,592 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:59:43,613 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:59:44,632 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:59:45,652 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:59:46,674 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T00:59:47,692 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T00:59:48,712 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	... (identical stack trace omitted; see the 00:59:47,692 entry above)
2025-09-06T00:59:49,733 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	... (identical stack trace omitted)
2025-09-06T00:59:50,752 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	... (identical stack trace omitted)
2025-09-06T00:59:51,773 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	... (identical stack trace omitted)
2025-09-06T00:59:52,792 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	... (identical stack trace omitted)
2025-09-06T00:59:53,813 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	... (identical stack trace omitted)
2025-09-06T00:59:54,832 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	... (identical stack trace omitted)
2025-09-06T00:59:55,853 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	... (identical stack trace omitted)
2025-09-06T00:59:56,873 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	... (identical stack trace omitted)
2025-09-06T00:59:57,892 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	... (identical stack trace omitted)
2025-09-06T00:59:58,912 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	... (identical stack trace omitted)
2025-09-06T00:59:59,933 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	... (identical stack trace omitted)
2025-09-06T01:00:00,952 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	... (identical stack trace omitted)
2025-09-06T01:00:01,973 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	... (identical stack trace omitted)
2025-09-06T01:00:02,991 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	... (identical stack trace omitted)
2025-09-06T01:00:06,362 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | PhiAccrualFailureDetector | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | heartbeat interval is growing too large for address pekko://opendaylight-cluster-data@10.30.171.248:2550: 3888 millis
2025-09-06T01:00:06,368 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:07,381 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:08,401 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:09,421 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:10,442 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:11,462 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:12,482 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:00:13,502 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:14,521 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:15,541 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:16,562 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:17,582 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:18,602 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:19,621 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:20,641 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:21,661 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:22,682 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:23,701 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:24,722 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:25,742 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:00:26,762 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:27,782 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:28,802 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:29,821 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:30,841 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:31,862 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more
2025-09-06T01:00:32,881 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
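The WARN records above and below all capture the same retry loop: the test has just taken down the shard leader, so the member-2 inventory-config shard rejects each connect attempt with NotLeaderException, AbstractShardBackendResolver wraps that into a TimeoutException ("Connection attempt failed"), logs it, and tries again roughly once per second (see the ~1 s spacing of the timestamps) until a new leader is elected. The following self-contained Java sketch is illustrative only -- it is not the OpenDaylight implementation, and every name in it other than the exception messages quoted from the trace is invented for the example -- but it shows the same wrap-the-cause-and-retry shape that produces a repeating log pattern like this one.

import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.TimeoutException;

public class ShardConnectRetrySketch {

    /** Hypothetical stand-in for the cluster's "shard is not the leader" failure. */
    static final class NotLeaderException extends RuntimeException {
        NotLeaderException(String message) {
            super(message);
        }
    }

    /** Simulated connect attempt: fails until a "leader" exists (here: the third attempt). */
    static CompletableFuture<String> connectShard(int attempt) {
        CompletableFuture<String> future = new CompletableFuture<>();
        if (attempt < 3) {
            future.completeExceptionally(new NotLeaderException(
                    "member-2-shard-inventory-config is not the current leader"));
        } else {
            future.complete("backend for member-2-shard-inventory-config");
        }
        return future;
    }

    /** Wrap the real failure the way the log shows it: a TimeoutException carrying the cause. */
    static TimeoutException wrap(String message, Throwable cause) {
        TimeoutException wrapped = new TimeoutException(message);
        wrapped.initCause(cause);
        return wrapped;
    }

    public static void main(String[] args) throws InterruptedException {
        for (int attempt = 1; ; attempt++) {
            try {
                String backend = connectShard(attempt).get();
                System.out.println("Resolved " + backend + " after " + attempt + " attempt(s)");
                return;
            } catch (ExecutionException e) {
                // Mirrors the repeating WARN lines: log the wrapped failure, then retry
                // instead of giving up, because leadership is expected to come back.
                TimeoutException failure = wrap("Connection attempt failed", e.getCause());
                System.out.println("WARN Failed to resolve shard: " + failure
                        + " (caused by " + failure.getCause() + ")");
                Thread.sleep(1000); // roughly the one-second spacing seen in the timestamps
            }
        }
    }
}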
2025-09-06T01:00:33,901 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:00:34,921 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:00:35,302 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shard Status For Leader After PreLeader Shutdown" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shard Status For Leader After PreLeader Shutdown
2025-09-06T01:00:35,692 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Start Mininet Connect To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Start Mininet Connect To Follower Node1
2025-09-06T01:00:35,941 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:00:36,074 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Add Bulk Flow From Follower" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Add Bulk Flow From Follower
2025-09-06T01:00:36,466 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Bulk Flows And Verify In Leader" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Bulk Flows And Verify In Leader
2025-09-06T01:00:36,892 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch Before Cluster Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch Before Cluster Restart
2025-09-06T01:00:36,963 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:00:37,291 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Restart Pre Leader From Cluster Node" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Restart Pre Leader From Cluster Node
2025-09-06T01:00:37,674 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Leader Restart
2025-09-06T01:00:37,983 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:00:39,001 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:00:40,021 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:00:41,042 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:00:42,061 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:00:43,082 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:00:44,102 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:00:45,121 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:00:46,142 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:00:47,161 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:00:48,182 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:49,202 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:50,222 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:51,242 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:52,261 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:53,281 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:54,302 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:55,322 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:56,341 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:57,362 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:00:58,381 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:00:59,402 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:00,422 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:01,441 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:02,462 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:03,481 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:04,502 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:05,526 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:06,541 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:07,564 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:08,582 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:09,601 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:10,621 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:01:11,641 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:12,662 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:01:13,683 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:01:14,701 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:01:15,721 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:01:16,741 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:01:17,761 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:01:18,781 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:01:19,802 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:01:20,821 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:01:21,841 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:01:22,861 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:01:23,880 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:01:24,901 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:01:25,921 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:01:26,941 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:01:27,961 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:01:28,982 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:01:30,001 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:31,021 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:32,042 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:33,062 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:34,083 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:35,102 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:36,121 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:37,142 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:01:38,162 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:39,182 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:40,201 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:41,222 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:42,241 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more
2025-09-06T01:01:43,261 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:01:44,281 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:01:45,301 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:01:46,321 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:01:47,341 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:01:48,361 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:01:49,383 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:01:50,402 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:01:51,421 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:01:52,442 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:01:53,462 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:01:54,482 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:01:55,501 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:01:56,521 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:01:57,542 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:58,561 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:01:59,582 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:00,601 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:01,622 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:02,642 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:03,663 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:02:04,682 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:05,702 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:06,722 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:07,741 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:08,761 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:09,781 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:10,801 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
[Each retry below logged this same stack trace, unchanged; the duplicated frames are omitted through the 01:02:25,093 entry.]
2025-09-06T01:02:11,821 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:02:12,841 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:02:13,861 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:02:14,881 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:02:15,902 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:02:16,921 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:02:17,942 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:02:18,960 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:02:19,989 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:02:21,012 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:02:22,032 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:02:23,052 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:02:24,072 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:02:25,093 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:02:26,112 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:27,132 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:28,152 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:29,172 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:30,191 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:02:31,212 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:32,233 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:33,252 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:34,271 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:35,292 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:36,312 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:37,332 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:38,352 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:39,371 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
[Editor's note: fifteen further "Failed to resolve shard" WARN entries, identical to the one above (java.util.concurrent.TimeoutException: Connection attempt failed, caused by the same NotLeaderException for member-2-shard-inventory-config), were logged at roughly one-second intervals from 2025-09-06T01:02:39,371 through 2025-09-06T01:02:53,652 and are omitted here.]
2025-09-06T01:02:54,672 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:55,692 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:56,712 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:02:57,731 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:58,752 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:02:59,771 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:00,792 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:01,813 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:02,835 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:03,852 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:04,872 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:05,892 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:06,911 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more
2025-09-06T01:03:07,931 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:03:08,952 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:03:09,972 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:03:10,992 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:03:12,012 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:03:13,032 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:03:14,052 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:03:15,072 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:03:16,092 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:03:17,112 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:03:18,132 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:03:19,154 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:03:20,173 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:03:21,192 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:03:22,212 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:03:23,232 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:03:24,254 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:25,272 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:26,292 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:27,312 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:28,332 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:29,353 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:30,371 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:31,391 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:32,411 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:33,432 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:03:34,453 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:03:35,471 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack traces identical to the 01:03:34,453 warning above)
2025-09-06T01:03:36,491 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack traces identical to the 01:03:34,453 warning above)
2025-09-06T01:03:37,512 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack traces identical to the 01:03:34,453 warning above)
2025-09-06T01:03:38,532 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack traces identical to the 01:03:34,453 warning above)
2025-09-06T01:03:39,552 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack traces identical to the 01:03:34,453 warning above)
2025-09-06T01:03:40,572 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack traces identical to the 01:03:34,453 warning above)
2025-09-06T01:03:41,592 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack traces identical to the 01:03:34,453 warning above)
2025-09-06T01:03:42,612 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack traces identical to the 01:03:34,453 warning above)
2025-09-06T01:03:43,632 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack traces identical to the 01:03:34,453 warning above)
2025-09-06T01:03:44,652 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack traces identical to the 01:03:34,453 warning above)
2025-09-06T01:03:45,671 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack traces identical to the 01:03:34,453 warning above)
2025-09-06T01:03:46,693 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack traces identical to the 01:03:34,453 warning above)
2025-09-06T01:03:47,712 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack traces identical to the 01:03:34,453 warning above)
2025-09-06T01:03:48,732 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack traces identical to the 01:03:34,453 warning above)
2025-09-06T01:03:49,752 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack traces identical to the 01:03:34,453 warning above)
2025-09-06T01:03:50,772 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:51,792 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:52,812 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:53,832 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:54,852 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:55,872 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:56,891 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:57,911 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:58,932 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:03:59,952 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:00,972 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:01,992 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:03,012 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:04:04,032 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
... 5 more
2025-09-06T01:04:05,052 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
... 5 more
2025-09-06T01:04:06,073 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
... 5 more
2025-09-06T01:04:07,092 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
... 5 more
2025-09-06T01:04:08,113 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
... 5 more
2025-09-06T01:04:09,131 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
... 5 more
2025-09-06T01:04:10,153 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
... 5 more
2025-09-06T01:04:11,172 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
... 5 more
2025-09-06T01:04:12,193 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
... 5 more
2025-09-06T01:04:13,213 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
... 5 more
2025-09-06T01:04:14,233 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
... 5 more
2025-09-06T01:04:15,254 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
... 5 more
2025-09-06T01:04:16,272 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
... 5 more
2025-09-06T01:04:17,292 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:18,312 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:19,333 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:20,352 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:21,372 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:22,395 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:23,413 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:24,432 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:25,453 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:26,474 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:27,492 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:28,511 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:29,532 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:04:30,552 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:31,572 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
[The same "Failed to resolve shard" WARN entry from AbstractShardBackendResolver, with an identical TimeoutException / NotLeaderException stack trace for member-2-shard-inventory-config, repeats at roughly one-second intervals (16 occurrences) from 2025-09-06T01:04:31,572 through 2025-09-06T01:04:46,872.]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:47,891 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:48,912 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:49,932 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:50,951 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:51,972 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:52,992 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:54,011 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:55,032 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:56,052 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:04:57,072 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:04:58,092 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:04:59,113 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:05:00,132 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (full stack trace identical to the 01:04:59,113 entry above)
2025-09-06T01:05:01,152 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (full stack trace identical to the 01:04:59,113 entry above)
2025-09-06T01:05:02,172 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (full stack trace identical to the 01:04:59,113 entry above)
2025-09-06T01:05:03,192 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (full stack trace identical to the 01:04:59,113 entry above)
2025-09-06T01:05:04,211 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (full stack trace identical to the 01:04:59,113 entry above)
2025-09-06T01:05:05,232 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (full stack trace identical to the 01:04:59,113 entry above)
2025-09-06T01:05:06,252 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (full stack trace identical to the 01:04:59,113 entry above)
2025-09-06T01:05:07,272 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (full stack trace identical to the 01:04:59,113 entry above)
2025-09-06T01:05:08,292 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (full stack trace identical to the 01:04:59,113 entry above)
2025-09-06T01:05:09,311 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (full stack trace identical to the 01:04:59,113 entry above)
2025-09-06T01:05:10,332 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (full stack trace identical to the 01:04:59,113 entry above)
2025-09-06T01:05:11,352 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (full stack trace identical to the 01:04:59,113 entry above)
2025-09-06T01:05:12,372 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (full stack trace identical to the 01:04:59,113 entry above)
2025-09-06T01:05:13,392 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (full stack trace identical to the 01:04:59,113 entry above)
2025-09-06T01:05:14,413 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (full stack trace identical to the 01:04:59,113 entry above)
2025-09-06T01:05:15,432 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:05:16,452 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:05:17,471 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:05:18,492 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:05:19,512 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:05:20,532 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:05:21,552 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:05:22,572 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:05:23,592 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:05:24,612 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:05:25,632 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:05:26,653 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:05:27,672 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:05:28,691 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:05:29,712 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:05:30,732 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:05:31,752 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:05:32,772 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:05:33,793 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:05:34,812 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:05:35,831 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:05:36,853 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:05:37,874 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:05:38,892 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:05:39,912 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:05:40,932 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T01:05:41,952 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
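
Note: the pairing of "Connection attempt failed" with a NotLeaderException cause matches the pattern visible in the frames above: the contacted shard replica replies that it is not the current Raft leader, the connect future completes exceptionally, and the client-side resolver re-wraps that failure inside a CompletableFuture.whenComplete callback. The standalone Java sketch below illustrates only that wrapping pattern; it is not the OpenDaylight implementation, and ShardConnectException, connectShard() and wrap() are hypothetical stand-ins for the classes and methods named in the trace.

import java.util.concurrent.CompletableFuture;
import java.util.concurrent.TimeoutException;

// Minimal sketch (not the actual OpenDaylight code) of the failure-wrapping
// pattern suggested by the stack trace above. ShardConnectException is a
// hypothetical stand-in for NotLeaderException; connectShard() and wrap()
// mirror the roles of AbstractShardBackendResolver only by analogy.
public class ConnectWrapSketch {

    static class ShardConnectException extends RuntimeException {
        ShardConnectException(String message) {
            super(message);
        }
    }

    // Preserve the real failure as the cause of a generic TimeoutException,
    // which is what produces the "Caused by:" structure seen in the log.
    static TimeoutException wrap(String message, Throwable cause) {
        TimeoutException timeout = new TimeoutException(message);
        timeout.initCause(cause);
        return timeout;
    }

    // Simulated backend reply: the contacted shard replica is not the leader.
    static CompletableFuture<String> connectShard() {
        return CompletableFuture.failedFuture(
            new ShardConnectException("member-2-shard-inventory-config is not the current leader"));
    }

    public static void main(String[] args) {
        CompletableFuture<String> resolved = new CompletableFuture<>();

        // In the real log this callback runs on a ForkJoinPool worker thread
        // (hence "ForkJoinPool.commonPool-worker-2" in the WARN lines); here the
        // future is already failed, so the callback runs inline on main.
        connectShard().whenComplete((backend, failure) -> {
            if (failure != null) {
                resolved.completeExceptionally(wrap("Connection attempt failed", failure));
            } else {
                resolved.complete(backend);
            }
        });

        resolved.whenComplete((backend, failure) ->
            System.out.println("Failed to resolve shard: " + failure
                + ", caused by " + failure.getCause()));
    }
}
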
The same warning, with an identical stack trace (java.util.concurrent.TimeoutException: Connection attempt failed, caused by the NotLeaderException above for member-2-shard-inventory-config), then repeats roughly once per second while the shard still has no elected leader:
2025-09-06T01:05:42,974 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:05:43,992 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:05:45,011 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:05:46,031 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:05:47,052 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:05:48,072 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:05:49,092 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:05:50,112 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:05:51,132 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:05:52,151 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:05:53,172 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:05:54,192 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:05:55,212 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:05:56,240 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:05:57,261 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:05:58,284 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:05:59,302 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:00,322 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:01,342 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:02,362 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:06:03,382 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:04,402 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:05,422 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:06,443 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:07,462 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:08,483 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:09,501 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:10,521 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:11,542 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:12,562 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:13,584 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:14,603 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:15,622 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:16,642 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:17,661 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:18,681 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:19,701 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:20,721 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:21,742 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:22,763 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:23,782 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:24,803 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:25,822 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:26,841 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:27,862 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:28,882 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:06:29,902 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:30,922 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:31,941 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:32,962 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:33,984 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:35,004 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:36,023 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:36,807 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Data Recovery After Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Data Recovery After Leader Restart 2025-09-06T01:06:37,045 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:38,062 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T01:06:39,082 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:40,102 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:41,121 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:42,142 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:43,162 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:44,182 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:45,202 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:46,222 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:47,241 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:48,261 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:49,281 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:50,302 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:51,321 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:52,341 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:06:53,362 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:54,381 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:55,403 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:56,422 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:57,443 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:58,463 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:06:59,482 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:07:00,502 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:01,522 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:02,541 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:03,561 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:04,582 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:05,601 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:06,622 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
[The same WARN entry from ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver ("Failed to resolve shard"), with an identical java.util.concurrent.TimeoutException stack trace caused by the same NotLeaderException for member-2-shard-inventory-config, recurs at roughly one-second intervals at 2025-09-06T01:07:07,642, 01:07:08,667, 01:07:09,682, 01:07:10,702, 01:07:11,722, 01:07:12,742, 01:07:13,762, 01:07:14,783, 01:07:15,801, 01:07:16,821, 01:07:17,841, 01:07:18,862, 01:07:19,882, 01:07:20,902 and 01:07:21,922; the final occurrence is truncated mid-trace in the source.]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:22,942 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:23,962 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:24,982 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:26,002 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:07:27,021 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:28,041 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:29,062 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:30,081 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:31,101 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:32,121 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:33,142 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:34,162 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:35,182 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
2025-09-06T01:07:36,201 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:07:37,222 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:07:38,242 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:07:39,266 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:07:40,281 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:07:41,302 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:07:42,322 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:07:43,341 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:07:44,361 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:07:45,382 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:07:46,402 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:07:47,422 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:07:48,442 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:07:49,462 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:07:50,481 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:51,502 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:52,521 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:07:53,542 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:54,562 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:55,582 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:56,601 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:57,623 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:58,642 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:07:59,662 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:00,681 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:01,702 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:02,722 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:03,742 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:04,761 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:05,781 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:08:06,802 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:07,821 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:08,842 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:09,862 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:10,881 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:11,901 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:12,922 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:13,942 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:14,961 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:15,983 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:08:17,001 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:08:18,023 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:08:18,189 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch After Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch After Leader Restart
2025-09-06T01:08:18,569 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Stop Mininet Connected To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Stop Mininet Connected To Follower Node1
2025-09-06T01:08:18,931 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Delete All Flows From Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Delete All Flows From Follower Node1
2025-09-06T01:08:19,043 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:08:19,293 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify No Flows In Leader Node" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify No Flows In Leader Node
2025-09-06T01:08:19,695 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Inventory Follower And Leader Before Cluster Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Inventory Follower And Leader Before Cluster Restart
2025-09-06T01:08:20,063 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:08:21,082 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:08:22,102 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:08:23,121 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:08:24,141 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:08:25,161 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:08:26,183 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:08:27,202 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:08:28,222 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:08:29,241 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:08:30,262 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:08:31,282 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:08:31,498 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Shutdown Follower From Cluster Node" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Shutdown Follower From Cluster Node
2025-09-06T01:08:31,901 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Follower Shutdown" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Follower Shutdown
2025-09-06T01:08:32,345 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more
2025-09-06T01:08:34,911 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | PhiAccrualFailureDetector | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | heartbeat interval is growing too large for address pekko://opendaylight-cluster-data@10.30.171.248:2550: 3349 millis
2025-09-06T01:08:34,912 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | PhiAccrualFailureDetector | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | heartbeat interval is growing too large for address pekko://opendaylight-cluster-data@10.30.171.195:2550: 3351 millis
2025-09-06T01:08:34,914 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:35,932 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:36,952 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] 
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:37,972 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:38,992 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:40,011 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:41,031 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:42,052 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:08:43,071 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:44,092 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:45,111 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:08:46,131 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack trace identical to the 01:08:45,111 warning above)
2025-09-06T01:08:47,151 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack trace identical to the 01:08:45,111 warning above)
2025-09-06T01:08:48,172 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack trace identical to the 01:08:45,111 warning above)
2025-09-06T01:08:49,191 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack trace identical to the 01:08:45,111 warning above)
2025-09-06T01:08:50,212 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack trace identical to the 01:08:45,111 warning above)
2025-09-06T01:08:51,232 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack trace identical to the 01:08:45,111 warning above)
2025-09-06T01:08:52,253 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack trace identical to the 01:08:45,111 warning above)
2025-09-06T01:08:53,272 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack trace identical to the 01:08:45,111 warning above)
2025-09-06T01:08:54,292 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack trace identical to the 01:08:45,111 warning above)
2025-09-06T01:08:55,313 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack trace identical to the 01:08:45,111 warning above)
2025-09-06T01:08:56,332 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack trace identical to the 01:08:45,111 warning above)
2025-09-06T01:08:57,352 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack trace identical to the 01:08:45,111 warning above)
2025-09-06T01:08:58,371 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack trace identical to the 01:08:45,111 warning above)
2025-09-06T01:08:59,391 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    ... (stack trace identical to the 01:08:45,111 warning above)
2025-09-06T01:09:00,412 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:01,431 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:02,452 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:03,473 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:04,492 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:05,512 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:06,532 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:07,551 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:08,572 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:09:09,591 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:10,612 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:11,632 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:12,651 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:13,672 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:09:14,692 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:09:15,711 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:09:16,732 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:09:17,751 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:09:18,772 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:09:19,791 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:09:20,811 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:09:21,831 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:09:22,852 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:09:23,872 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:09:24,893 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:09:25,911 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:09:26,932 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:09:27,952 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:09:28,971 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:09:29,991 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:09:31,013 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:09:32,031 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:09:33,051 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:09:34,071 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:09:35,096 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:09:36,111 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:09:37,133 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:09:38,151 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:09:39,171 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:09:40,191 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:09:41,211 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:09:42,232 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:09:43,251 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:09:44,272 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:45,292 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:46,312 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:47,332 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:48,352 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:09:49,371 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:50,393 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:51,412 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:52,431 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:53,452 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:54,472 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:55,492 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:56,513 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:57,533 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:58,553 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:09:59,572 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:00,592 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:01,612 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:10:02,632 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:03,652 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:04,672 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:05,691 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:06,711 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:07,733 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:08,752 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:09,771 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:10,792 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:10:11,811 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:10:12,763 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Start Mininet Connect To Follower Node" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Start Mininet Connect To Follower Node
2025-09-06T01:10:12,832 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:10:13,150 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Add Bulk Flow From Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Add Bulk Flow From Follower Node1
2025-09-06T01:10:13,585 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Bulk Flows And Verify In Leader Before Follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Bulk Flows And Verify In Leader Before Follower Restart
2025-09-06T01:10:13,853 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:10:13,991 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch Before Follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch Before Follower Restart
2025-09-06T01:10:14,346 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Restart Follower From Cluster Node" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Restart Follower From Cluster Node
2025-09-06T01:10:14,737 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Follower Restart
2025-09-06T01:10:14,873 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:10:15,892 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:10:16,911 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:10:17,932 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:10:18,952 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:10:19,973 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:10:20,993 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:10:22,011 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:10:23,031 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:10:24,051 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:10:25,072 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
...
5 more 2025-09-06T01:10:26,092 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:27,111 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:28,133 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:29,151 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:30,171 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:31,191 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:32,212 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:33,232 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:34,251 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:35,272 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:36,292 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:37,312 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:38,331 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:10:39,352 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:10:40,372 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:10:41,392 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:10:42,411 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:10:43,431 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:10:44,452 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:10:45,471 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:10:46,491 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:10:47,511 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:10:48,532 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:10:49,551 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:10:50,572 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:10:51,592 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:10:52,611 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:10:53,632 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:54,652 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:55,672 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:56,692 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:57,712 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:58,733 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:10:59,751 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:11:00,772 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:11:01,791 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:11:02,812 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:11:03,831 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:11:04,852 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:11:05,871 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
2025-09-06T01:11:06,891 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:11:07,912 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:11:08,932 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:11:09,952 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:11:10,973 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:11:11,991 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:11:13,012 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:11:14,032 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:11:15,052 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:11:16,072 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:11:17,093 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:11:18,112 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:11:19,132 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:11:20,152 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:11:21,171 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:11:22,192 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:11:23,213 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:11:24,232 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:11:25,251 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:11:26,271 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:11:27,291 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:11:28,311 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:11:29,331 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:11:30,352 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:11:31,372 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:11:32,391 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:11:33,411 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
[The same "Failed to resolve shard" WARN entry from AbstractShardBackendResolver, with an identical TimeoutException/NotLeaderException stack trace for member-2-shard-inventory-config, repeats at one-second intervals at 01:11:34,432, 01:11:35,452, 01:11:36,470, 01:11:37,492, 01:11:38,512, 01:11:39,531, 01:11:40,551, 01:11:41,572, 01:11:42,592, 01:11:43,611, 01:11:44,632, 01:11:45,652, 01:11:46,672, 01:11:47,694 and 01:11:48,712.]
2025-09-06T01:11:49,732 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:11:50,752 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:11:51,771 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:11:52,791 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:11:53,812 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:11:54,831 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:11:55,851 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:11:56,871 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:11:57,891 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:11:58,912 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:11:59,931 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:00,951 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:01,972 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:02,991 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:12:04,011 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:12:05,032 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:12:06,052 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:12:07,072 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:12:08,093 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:12:09,112 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:12:10,132 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:12:11,152 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:12:12,172 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:12:13,191 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:12:14,213 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:12:15,231 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:12:16,253 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:12:17,271 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:12:18,292 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:19,311 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:20,332 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:21,351 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:22,371 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:23,391 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:24,412 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:12:25,431 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:26,452 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:27,472 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:28,491 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:29,512 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:30,533 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:31,551 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:32,572 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:33,592 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:34,613 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:35,632 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:36,651 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:37,671 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:12:38,692 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:39,711 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:40,732 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:41,751 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:42,771 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:43,791 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:44,811 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:45,832 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:46,852 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:47,872 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:48,892 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:49,912 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:50,932 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:12:51,952 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:52,972 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:53,991 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:55,010 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:56,032 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:57,052 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:58,072 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:12:59,091 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
... 5 more
2025-09-06T01:13:00,112 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:01,132 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:02,152 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:03,172 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:04,192 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:05,211 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:06,232 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:07,251 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:08,272 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:09,291 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:10,312 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:11,332 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:12,353 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:13,372 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:14,392 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:15,411 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:16,432 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:17,451 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:13:18,471 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:19,492 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:20,511 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:21,532 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:22,551 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:23,572 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:24,592 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:25,612 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:26,632 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:13:27,653 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:28,671 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:29,692 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:30,713 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:31,731 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:32,752 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:33,772 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:34,792 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:35,812 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:36,832 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:37,852 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:38,872 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:39,891 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:40,912 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:41,931 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:13:42,951 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:43,973 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:13:44,992 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:46,011 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:47,033 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:48,052 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:49,071 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:50,092 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:51,111 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:52,131 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:53,152 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:54,190 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:55,213 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:56,231 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:57,253 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:13:58,272 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:13:59,293 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:00,311 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:01,332 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:02,352 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:03,371 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:04,392 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:05,412 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:06,432 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:07,451 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:08,471 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T01:14:09,492 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:14:10,511 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:14:11,531 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:14:12,551 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:14:13,572 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:14:14,592 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:14:15,612 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:14:16,632 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:14:17,651 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:14:18,671 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:14:19,692 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:14:20,712 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:14:21,732 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:14:22,751 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
2025-09-06T01:14:23,772 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
[Each of the fifteen WARN entries above carries the same java.util.concurrent.TimeoutException: Connection attempt failed, caused by the same NotLeaderException for member-2-shard-inventory-config; the duplicated stack traces are byte-identical to the 01:14:08,471 trace above and are not reproduced here.]
5 more 2025-09-06T01:14:24,792 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:25,811 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:26,832 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:27,852 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:28,871 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:29,892 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:30,912 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:31,931 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:32,951 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:33,971 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:34,991 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:36,011 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:37,031 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
2025-09-06T01:14:38,051 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:14:39,072 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:14:40,091 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:14:41,111 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:14:42,131 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:14:43,151 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:14:44,172 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:14:45,192 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:14:46,211 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:14:47,232 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:14:48,251 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:14:49,271 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:14:50,292 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:14:51,312 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:14:52,332 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:14:53,352 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:54,371 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:55,392 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:56,412 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:57,432 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:58,452 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:14:59,472 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:00,491 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:01,511 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:02,532 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:03,551 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:15:04,571 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:05,592 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:06,612 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:07,631 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:08,651 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:09,674 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:10,692 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:11,712 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:12,732 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:13,751 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:14,771 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:15,791 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:16,811 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:15:17,831 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:18,852 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
... 5 more
2025-09-06T01:15:19,871 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:15:20,892 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:15:21,912 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:15:22,932 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:15:23,952 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:15:24,972 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:15:25,992 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:15:27,013 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:15:28,031 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:15:29,052 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:15:30,072 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:15:31,092 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:15:32,112 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:15:33,131 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:15:34,152 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:15:35,171 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:36,191 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:37,212 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:38,232 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:39,252 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:40,272 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:41,292 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:42,313 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:43,331 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:15:44,351 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:45,372 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:46,392 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:47,412 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:48,432 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:49,451 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:50,472 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:51,491 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:52,511 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:53,531 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:54,552 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:55,572 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:56,591 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:15:57,611 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:58,632 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:15:59,652 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:16:00,671 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:16:01,692 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:16:02,711 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:16:03,730 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:16:04,751 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:16:05,772 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:16:06,792 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:16:07,811 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:16:08,832 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:16:09,851 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:16:09,998 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Data Recovery After Follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Data Recovery After Follower Restart
2025-09-06T01:16:10,872 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:16:11,892 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:16:12,911 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:16:13,932 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:16:14,952 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:16:15,972 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-09-06T01:16:16,992 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:16:18,012 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:16:19,032 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:16:20,052 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:16:21,072 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:16:22,091 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:16:23,112 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:16:24,132 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:16:25,152 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:16:26,172 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:16:27,191 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:16:28,211 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:16:29,231 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-09-06T01:16:30,251 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:16:31,272 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:16:32,292 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:16:33,312 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:16:34,331 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:16:35,352 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:16:36,372 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:16:37,392 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:16:38,411 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:16:39,432 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:16:40,451 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:16:41,471 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:16:42,491 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:16:43,511 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:16:44,531 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
5 more 2025-09-06T01:16:45,552 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:16:46,572 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:16:47,592 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:16:48,612 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:16:49,631 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:16:50,651 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:16:54,484 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:16:54,487 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-27 | PhiAccrualFailureDetector | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | heartbeat interval is growing too large for address pekko://opendaylight-cluster-data@10.30.171.248:2550: 3747 millis 2025-09-06T01:16:59,447 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | PhiAccrualFailureDetector | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | heartbeat interval is growing too large for address pekko://opendaylight-cluster-data@10.30.171.248:2550: 4268 millis 2025-09-06T01:16:59,447 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:17:00,462 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:17:01,481 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:17:02,501 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:17:03,522 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:17:04,542 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
[The same "Failed to resolve shard" WARN entry from AbstractShardBackendResolver, with an identical TimeoutException / NotLeaderException stack trace, is logged roughly once per second at 01:17:05,561, 01:17:06,581, 01:17:07,602, 01:17:08,622, 01:17:09,642, 01:17:10,662, 01:17:11,681, 01:17:12,702, 01:17:13,722, 01:17:14,741, 01:17:15,762, 01:17:16,781, 01:17:17,801 and 01:17:18,822.]
2025-09-06T01:17:19,842 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:17:20,861 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:17:21,882 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:17:22,902 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:17:23,922 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:17:24,941 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:17:25,962 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:17:26,982 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:17:28,002 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:17:29,022 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:17:30,041 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:17:31,062 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:17:32,081 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:17:33,101 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-09-06T01:17:34,122 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:17:35,141 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:17:36,161 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:17:37,196 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:17:38,212 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:17:39,231 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:17:40,251 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:17:41,271 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:17:42,291 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:17:43,311 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:17:44,331 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:17:45,351 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:17:46,371 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:17:47,391 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
2025-09-06T01:17:48,411 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:17:49,430 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:17:50,451 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
... 5 more
2025-09-06T01:17:51,359 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch After Follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch After Follower Restart
2025-09-06T01:17:51,471 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more
2025-09-06T01:17:51,722 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Stop Mininet Connected To Follower Node" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Stop Mininet Connected To Follower Node
2025-09-06T01:17:52,068 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Delete All Flows From Follower Node" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Delete All Flows From Follower Node
2025-09-06T01:17:52,487 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify No Flows In Leader Node After Follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify No Flows In Leader Node After Follower Restart
2025-09-06T01:17:52,490 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:17:53,510 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:17:54,532 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
... 5 more
2025-09-06T01:17:55,177 | INFO | sshd-SshServer[4c5affee](port=8101)-nio2-thread-1 | ServerSessionImpl | 125 - org.apache.sshd.osgi - 2.14.0 | Session karaf@/10.30.170.65:59512 authenticated
2025-09-06T01:17:55,550 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader
at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?]
at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
... 5 more
2025-09-06T01:17:55,645 | INFO | pipe-log:log "ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-titanium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/030__Cluster_HA_Data_Recovery_BulkFlow_Single_Switch.robot" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-titanium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/030__Cluster_HA_Data_Recovery_BulkFlow_Single_Switch.robot
2025-09-06T01:17:56,028 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Check Shards Status And Initialize Variables" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Check Shards Status And Initialize Variables
2025-09-06T01:17:56,572 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:17:57,591 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:17:58,611 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:17:59,631 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:18:00,651 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:18:01,671 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:18:02,691 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more
2025-09-06T01:18:03,606 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Inventory Follower Before Cluster Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Inventory Follower Before Cluster Restart
2025-09-06T01:18:03,711 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:18:04,732 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:18:05,753 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:18:06,771 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:18:07,791 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:18:08,811 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:18:09,831 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:18:10,851 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:18:11,871 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:18:12,891 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:18:13,912 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:18:14,931 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:18:15,432 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Connect To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Connect To Follower Node1 2025-09-06T01:18:15,775 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Add Bulk Flow From Follower" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Add Bulk Flow From Follower 2025-09-06T01:18:15,951 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-09-06T01:18:16,166 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Bulk Flows and Verify In Cluster" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Bulk Flows and Verify In Cluster 2025-09-06T01:18:16,556 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch Before Cluster Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch Before Cluster Restart 2025-09-06T01:18:16,890 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Kill All Cluster Nodes" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Kill All Cluster Nodes 2025-09-06T01:18:16,972 | WARN | ForkJoinPool.commonPool-worker-2 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#-1754436292] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:518) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:333) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-09-06T01:18:17,715 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection closed with error: Connection reset Sep 06, 2025 1:18:54 AM org.apache.karaf.main.lock.SimpleFileLock lock INFO: Trying to lock /tmp/karaf-0.23.0/lock Sep 06, 2025 1:18:54 AM org.apache.karaf.main.lock.SimpleFileLock lock INFO: Lock acquired Sep 06, 2025 1:18:54 AM org.apache.karaf.main.Main$KarafLockCallback lockAcquired INFO: Lock acquired. Setting startlevel to 100 2025-09-06T01:18:55,206 | INFO | CM Configuration Updater (ManagedService Update: pid=[org.ops4j.pax.logging]) | EventAdminConfigurationNotifier | 4 - org.ops4j.pax.logging.pax-logging-log4j2 - 2.2.8 | Logging configuration changed. (Event Admin service unavailable - no notification sent). 
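The warnings above all repeat the same pattern: each attempt to connect the shard backend fails with a TimeoutException whose cause is a NotLeaderException, because member-2's inventory-config shard has no elected leader yet. Below is a minimal, hypothetical sketch of the retry-with-backoff pattern such transient failures call for; it uses only JDK CompletableFuture machinery and is not OpenDaylight's actual AbstractShardBackendResolver logic — the class and method names are invented for illustration.

    // Hypothetical sketch: retrying an async connect when the failure is transient,
    // e.g. a TimeoutException raised while a shard still has no elected leader.
    import java.util.concurrent.*;
    import java.util.function.Supplier;

    public final class RetryingConnect {
        private static final ScheduledExecutorService SCHEDULER =
            Executors.newSingleThreadScheduledExecutor();

        // Retries the supplied async operation up to maxAttempts times, doubling the
        // delay between attempts.
        static <T> CompletableFuture<T> withRetry(Supplier<CompletableFuture<T>> op,
                                                  int maxAttempts, long delayMillis) {
            CompletableFuture<T> result = new CompletableFuture<>();
            attempt(op, maxAttempts, delayMillis, result);
            return result;
        }

        private static <T> void attempt(Supplier<CompletableFuture<T>> op, int attemptsLeft,
                                        long delayMillis, CompletableFuture<T> result) {
            op.get().whenComplete((value, failure) -> {
                if (failure == null) {
                    result.complete(value);
                } else if (attemptsLeft > 1 && isTransient(failure)) {
                    // Back off and try again -- mirrors the repeated WARNs above, where each
                    // attempt fails until a leader is elected.
                    SCHEDULER.schedule(
                        () -> attempt(op, attemptsLeft - 1, delayMillis * 2, result),
                        delayMillis, TimeUnit.MILLISECONDS);
                } else {
                    result.completeExceptionally(failure);
                }
            });
        }

        private static boolean isTransient(Throwable failure) {
            Throwable cause = failure instanceof CompletionException ? failure.getCause() : failure;
            return cause instanceof TimeoutException;
        }
    }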
2025-09-06T01:18:55,273 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.apache.aries.blueprint.cm/1.3.2 has been started 2025-09-06T01:18:55,302 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.apache.aries.blueprint.core/1.10.3 has been started 2025-09-06T01:18:55,412 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Starting JMX OSGi agent 2025-09-06T01:18:55,421 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering MBean with ObjectName [osgi.core:service=permissionadmin,version=1.2,framework=org.eclipse.osgi,uuid=ff11de6f-1642-4e2e-a4e7-b52d606f885a] for service with service.id [15] 2025-09-06T01:18:55,422 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering MBean with ObjectName [osgi.compendium:service=cm,version=1.3,framework=org.eclipse.osgi,uuid=ff11de6f-1642-4e2e-a4e7-b52d606f885a] for service with service.id [40] 2025-09-06T01:18:55,440 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | ROOT | 93 - org.apache.felix.scr - 2.2.6 | bundle org.apache.felix.scr:2.2.6 (93) Starting with globalExtender setting: false 2025-09-06T01:18:55,443 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | ROOT | 93 - org.apache.felix.scr - 2.2.6 | bundle org.apache.felix.scr:2.2.6 (93) Version = 2.2.6 2025-09-06T01:18:55,569 | INFO | activator-1-thread-1 | Activator | 113 - org.apache.karaf.management.server - 4.4.7 | Setting java.rmi.server.hostname system property to 127.0.0.1 2025-09-06T01:18:55,673 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.PackageStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@3ec952a8 with name osgi.core:type=packageState,version=1.5,framework=org.eclipse.osgi,uuid=ff11de6f-1642-4e2e-a4e7-b52d606f885a 2025-09-06T01:18:55,674 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.BundleStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@3ec952a8 with name osgi.core:type=bundleState,version=1.7,framework=org.eclipse.osgi,uuid=ff11de6f-1642-4e2e-a4e7-b52d606f885a 2025-09-06T01:18:55,675 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.wiring.BundleWiringStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@3ec952a8 with name osgi.core:type=wiringState,version=1.1,framework=org.eclipse.osgi,uuid=ff11de6f-1642-4e2e-a4e7-b52d606f885a 2025-09-06T01:18:55,675 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.FrameworkMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@3ec952a8 with name osgi.core:type=framework,version=1.7,framework=org.eclipse.osgi,uuid=ff11de6f-1642-4e2e-a4e7-b52d606f885a 2025-09-06T01:18:55,675 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering 
org.osgi.jmx.framework.ServiceStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@3ec952a8 with name osgi.core:type=serviceState,version=1.7,framework=org.eclipse.osgi,uuid=ff11de6f-1642-4e2e-a4e7-b52d606f885a 2025-09-06T01:18:55,676 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.service.cm.ConfigurationAdminMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@3ec952a8 with name osgi.compendium:service=cm,version=1.3,framework=org.eclipse.osgi,uuid=ff11de6f-1642-4e2e-a4e7-b52d606f885a 2025-09-06T01:18:55,676 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.service.permissionadmin.PermissionAdminMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@3ec952a8 with name osgi.core:service=permissionadmin,version=1.2,framework=org.eclipse.osgi,uuid=ff11de6f-1642-4e2e-a4e7-b52d606f885a 2025-09-06T01:18:55,696 | INFO | activator-1-thread-2 | Activator | 99 - org.apache.karaf.deployer.features - 4.4.7 | Deployment finished. Registering FeatureDeploymentListener 2025-09-06T01:18:55,701 | INFO | activator-1-thread-1 | ServiceComponentRuntimeMBeanImpl | 115 - org.apache.karaf.scr.management - 4.4.7 | Activating the Apache Karaf ServiceComponentRuntime MBean 2025-09-06T01:18:55,782 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.bundle.core/4.4.7 2025-09-06T01:18:55,790 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.config.command/4.4.7 2025-09-06T01:18:55,851 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.deployer.kar/4.4.7 2025-09-06T01:18:55,852 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.diagnostic.core/4.4.7 2025-09-06T01:18:55,863 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.features.command/4.4.7 2025-09-06T01:18:55,866 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Command registration delayed for bundle org.apache.karaf.http.core/4.4.7. 
Missing service: [org.apache.karaf.http.core.ProxyService] 2025-09-06T01:18:55,872 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.instance.core/4.4.7 2025-09-06T01:18:55,881 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.jaas.command/4.4.7 2025-09-06T01:18:55,882 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Updating commands for bundle org.apache.karaf.jaas.command/4.4.7 2025-09-06T01:18:55,882 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Updating commands for bundle org.apache.karaf.jaas.command/4.4.7 2025-09-06T01:18:55,885 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.kar.core/4.4.7 2025-09-06T01:18:55,888 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.log.core/4.4.7 2025-09-06T01:18:55,889 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.package.core/4.4.7 2025-09-06T01:18:55,891 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.service.core/4.4.7 2025-09-06T01:18:55,899 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.shell.commands/4.4.7 2025-09-06T01:18:55,899 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Updating commands for bundle org.apache.karaf.shell.commands/4.4.7 2025-09-06T01:18:55,902 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | Activator | 120 - org.apache.karaf.shell.core - 4.4.7 | Not starting local console. To activate set karaf.startLocalConsole=true 2025-09-06T01:18:55,954 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.apache.karaf.shell.core/4.4.7 has been started 2025-09-06T01:18:56,006 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Command registration delayed for bundle org.apache.karaf.shell.ssh/4.4.7. 
Missing service: [org.apache.sshd.server.SshServer] 2025-09-06T01:18:56,051 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.system.core/4.4.7 2025-09-06T01:18:56,085 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Command registration delayed for bundle org.apache.karaf.web.core/4.4.7. Missing service: [org.apache.karaf.web.WebContainerService] 2025-09-06T01:18:56,152 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | Activator | 392 - org.ops4j.pax.web.pax-web-extender-war - 8.0.30 | Configuring WAR extender thread pool. Pool size = 3 2025-09-06T01:18:56,228 | INFO | activator-1-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.shell.ssh/4.4.7 2025-09-06T01:18:56,245 | INFO | activator-1-thread-1 | DefaultIoServiceFactoryFactory | 125 - org.apache.sshd.osgi - 2.14.0 | No detected/configured IoServiceFactoryFactory; using Nio2ServiceFactoryFactory 2025-09-06T01:18:56,274 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | Activator | 393 - org.ops4j.pax.web.pax-web-extender-whiteboard - 8.0.30 | Starting Pax Web Whiteboard Extender 2025-09-06T01:18:56,305 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | log | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Logging initialized @3190ms to org.eclipse.jetty.util.log.Slf4jLog 2025-09-06T01:18:56,323 | INFO | CM Configuration Updater (ManagedService Update: pid=[org.ops4j.pax.web]) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Scheduling Pax Web reconfiguration because configuration has changed 2025-09-06T01:18:56,323 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | EventAdmin support enabled, WAB events will be posted to EventAdmin topics. 2025-09-06T01:18:56,324 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Pax Web Runtime started 2025-09-06T01:18:56,325 | INFO | paxweb-config-3-thread-1 (change config) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Scheduling Pax Web reconfiguration because ServerControllerFactory has been registered 2025-09-06T01:18:56,358 | INFO | paxweb-config-3-thread-1 (change controller) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Configuring server controller org.ops4j.pax.web.service.jetty.internal.JettyServerController 2025-09-06T01:18:56,361 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.0 | Starting BlueprintBundleTracker 2025-09-06T01:18:56,368 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Configuring JettyServerController{configuration=4eb5a593-9819-4deb-91f4-16acdc17a52d,state=UNCONFIGURED} 2025-09-06T01:18:56,369 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Creating Jetty server instance using configuration properties. 
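The "Creating Jetty server instance using configuration properties" step here is followed a few lines below by the connector details Pax Web extracts from etc/jetty.xml: the "jetty-default" connector on 0.0.0.0:8181 and a QueuedThreadPool capped at 200 threads. As a rough illustration only (not Pax Web's actual code), the same server could be assembled programmatically with the Jetty 9.4 API:

    // Rough programmatic equivalent of the server Pax Web builds from etc/jetty.xml:
    // a Jetty 9.4 Server with a 200-thread QueuedThreadPool and one plain HTTP
    // connector bound to 0.0.0.0:8181.
    import org.eclipse.jetty.server.Server;
    import org.eclipse.jetty.server.ServerConnector;
    import org.eclipse.jetty.util.thread.QueuedThreadPool;

    public final class EmbeddedJettySketch {
        public static void main(String[] args) throws Exception {
            QueuedThreadPool threadPool = new QueuedThreadPool(200);  // "0<=0<=200" in the log
            Server server = new Server(threadPool);

            ServerConnector connector = new ServerConnector(server);  // HTTP/1.1
            connector.setName("jetty-default");
            connector.setHost("0.0.0.0");
            connector.setPort(8181);
            server.addConnector(connector);

            server.start();  // corresponds to "Started jetty-default@...{0.0.0.0:8181}"
            server.join();
        }
    }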
2025-09-06T01:18:56,369 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.0 | Blueprint container for bundle org.apache.aries.blueprint.cm_1.3.2 [78] was successfully created 2025-09-06T01:18:56,370 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.0 | Blueprint container for bundle org.apache.aries.blueprint.core_1.10.3 [79] was successfully created 2025-09-06T01:18:56,370 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.0 | Blueprint container for bundle org.apache.karaf.shell.core_4.4.7 [120] was successfully created 2025-09-06T01:18:56,391 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Processing Jetty configuration from files: [etc/jetty.xml] 2025-09-06T01:18:56,508 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.jdbc.core/4.4.7 2025-09-06T01:18:56,535 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Found configured connector "jetty-default": 0.0.0.0:8181 2025-09-06T01:18:56,536 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Using configured jetty-default@b9795a4{HTTP/1.1, (http/1.1)}{0.0.0.0:8181} as non secure connector for address: 0.0.0.0:8181 2025-09-06T01:18:56,537 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Eagerly starting Jetty thread pool QueuedThreadPool[qtp745298161]@2c6c58f1{STOPPED,0<=0<=200,i=0,r=-1,q=0}[NO_TRY] 2025-09-06T01:18:56,557 | INFO | paxweb-config-3-thread-1 (change controller) | JettyFactory | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Adding JMX support to Jetty server 2025-09-06T01:18:56,603 | INFO | paxweb-config-3-thread-1 (change controller) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Starting server controller org.ops4j.pax.web.service.jetty.internal.JettyServerController 2025-09-06T01:18:56,604 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Starting JettyServerController{configuration=4eb5a593-9819-4deb-91f4-16acdc17a52d,state=STOPPED} 2025-09-06T01:18:56,604 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Starting Server@69a494e6{STOPPED}[9.4.57.v20241219] 2025-09-06T01:18:56,605 | INFO | paxweb-config-3-thread-1 (change controller) | Server | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | jetty-9.4.57.v20241219; built: 2025-01-08T21:24:30.412Z; git: df524e6b29271c2e09ba9aea83c18dc9db464a31; jvm 21.0.5+11-Ubuntu-1ubuntu122.04 2025-09-06T01:18:56,615 | INFO | paxweb-config-3-thread-1 (change controller) | session | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | DefaultSessionIdManager workerName=node0 2025-09-06T01:18:56,616 | INFO | paxweb-config-3-thread-1 (change controller) | session | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | No SessionScavenger set, using defaults 2025-09-06T01:18:56,617 | INFO | paxweb-config-3-thread-1 (change controller) | session | 139 - org.eclipse.jetty.util - 
9.4.57.v20241219 | node0 Scavenging every 660000ms 2025-09-06T01:18:56,637 | INFO | paxweb-config-3-thread-1 (change controller) | AbstractConnector | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started jetty-default@b9795a4{HTTP/1.1, (http/1.1)}{0.0.0.0:8181} 2025-09-06T01:18:56,638 | INFO | paxweb-config-3-thread-1 (change controller) | Server | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started @3531ms 2025-09-06T01:18:56,639 | INFO | paxweb-config-3-thread-1 (change controller) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Registering HttpService factory 2025-09-06T01:18:56,641 | INFO | paxweb-config-3-thread-1 (change controller) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Binding HTTP Service for bundle: [org.apache.karaf.http.core_4.4.7 [105]] 2025-09-06T01:18:56,650 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.0 is waiting for dependencies [(objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2025-09-06T01:18:56,651 | INFO | paxweb-config-3-thread-1 (change controller) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Binding HTTP Service for bundle: [org.apache.karaf.web.core_4.4.7 [124]] 2025-09-06T01:18:56,655 | INFO | activator-1-thread-2 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.http.core/4.4.7 2025-09-06T01:18:56,662 | INFO | activator-1-thread-2 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.apache.karaf.web.core/4.4.7 2025-09-06T01:18:56,665 | INFO | HttpService->Whiteboard (add HttpService) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Binding HTTP Service for bundle: [org.ops4j.pax.web.pax-web-extender-whiteboard_8.0.30 [393]] 2025-09-06T01:18:56,663 | INFO | paxweb-config-3-thread-1 (change controller) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Binding HTTP Service for bundle: [org.jolokia.osgi_1.7.2 [155]] 2025-09-06T01:18:56,668 | INFO | HttpService->WarExtender (add HttpService) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Binding HTTP Service for bundle: [org.ops4j.pax.web.pax-web-extender-war_8.0.30 [392]] 2025-09-06T01:18:56,677 | INFO | paxweb-config-3-thread-1 (change controller) | ServerModel | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.30 | Created new ServletContextModel{id=ServletContextModel-4,contextPath='/'} 2025-09-06T01:18:56,678 | INFO | paxweb-config-3-thread-1 (change controller) | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Registering ServletModel{id=ServletModel-3,name='org.jolokia.osgi.servlet.JolokiaServlet',alias='/jolokia',urlPatterns=[/jolokia/*],servlet=org.jolokia.osgi.servlet.JolokiaServlet@34811ab3,contexts=[{HS,OCM-5,context:273638682,/}]} 2025-09-06T01:18:56,679 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of 
ServletModel{id=ServletModel-3,name='org.jolokia.osgi.servlet.JolokiaServlet',alias='/jolokia',urlPatterns=[/jolokia/*],servlet=org.jolokia.osgi.servlet.JolokiaServlet@34811ab3,contexts=null}", size=4} 2025-09-06T01:18:56,679 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-4,contextPath='/'} 2025-09-06T01:18:56,682 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CustomFilterAdapterConfigurationImpl | 166 - org.opendaylight.aaa.filterchain - 0.21.0 | Custom filter properties updated: {service.pid=org.opendaylight.aaa.filterchain, osgi.ds.satisfying.condition.target=(osgi.condition.id=true), customFilterList=, component.name=org.opendaylight.aaa.filterchain.configuration.impl.CustomFilterAdapterConfigurationImpl, felix.fileinstall.filename=file:/tmp/karaf-0.23.0/etc/org.opendaylight.aaa.filterchain.cfg, component.id=4, Filter.target=(org.opendaylight.aaa.filterchain.filter=true)} 2025-09-06T01:18:56,703 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Adding OsgiContextModel{HS,id=OCM-5,name='context:273638682',path='/',bundle=org.jolokia.osgi,context=WebContainerContextWrapper{bundle=org.jolokia.osgi_1.7.2 [155],contextId='context:273638682',delegate=org.jolokia.osgi.security.ServiceAuthenticationHttpContext@104f651a}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@35f90473{/,null,STOPPED} 2025-09-06T01:18:56,705 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@35f90473{/,null,STOPPED} 2025-09-06T01:18:56,706 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Adding servlet ServletModel{id=ServletModel-3,name='org.jolokia.osgi.servlet.JolokiaServlet',alias='/jolokia',urlPatterns=[/jolokia/*],servlet=org.jolokia.osgi.servlet.JolokiaServlet@34811ab3,contexts=[{HS,OCM-5,context:273638682,/}]} 2025-09-06T01:18:56,708 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.0 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (objectClass=org.opendaylight.aaa.api.IIDMStore), (objectClass=org.opendaylight.aaa.api.AuthenticationService), (objectClass=org.opendaylight.aaa.web.servlet.ServletSupport), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth)] 2025-09-06T01:18:56,709 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Starting Jetty context "/" with default Osgi Context OsgiContextModel{HS,id=OCM-5,name='context:273638682',path='/',bundle=org.jolokia.osgi,context=WebContainerContextWrapper{bundle=org.jolokia.osgi_1.7.2 
[155],contextId='context:273638682',delegate=org.jolokia.osgi.security.ServiceAuthenticationHttpContext@104f651a}} 2025-09-06T01:18:56,726 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Binding HTTP Service for bundle: [org.opendaylight.aaa.shiro_0.21.0 [172]] 2025-09-06T01:18:56,727 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | BundleWhiteboardApplication | 393 - org.ops4j.pax.web.pax-web-extender-whiteboard - 8.0.30 | No matching target context(s) for Whiteboard element ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[]}. Filter: (osgi.http.whiteboard.context.name=default). Element may be re-registered later, when matching context/s is/are registered. 2025-09-06T01:18:56,730 | INFO | paxweb-config-3-thread-1 (change controller) | osgi | 155 - org.jolokia.osgi - 1.7.2 | No access restrictor found, access to any MBean is allowed 2025-09-06T01:18:56,739 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.0 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (objectClass=org.opendaylight.aaa.api.IIDMStore), (objectClass=org.opendaylight.aaa.web.servlet.ServletSupport), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth)] 2025-09-06T01:18:56,747 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.0 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (objectClass=org.opendaylight.aaa.api.IIDMStore), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth)] 2025-09-06T01:18:56,766 | INFO | paxweb-config-3-thread-1 (change controller) | ContextHandler | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@35f90473{/,null,AVAILABLE} 2025-09-06T01:18:56,767 | INFO | paxweb-config-3-thread-1 (change controller) | OsgiServletContext | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.30 | Registering OsgiServletContext{model=OsgiContextModel{HS,id=OCM-5,name='context:273638682',path='/',bundle=org.jolokia.osgi,context=WebContainerContextWrapper{bundle=org.jolokia.osgi_1.7.2 [155],contextId='context:273638682',delegate=org.jolokia.osgi.security.ServiceAuthenticationHttpContext@104f651a}}} as OSGi service for "/" context path 2025-09-06T01:18:56,769 | INFO | paxweb-config-3-thread-1 (change controller) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Registering HttpServiceRuntime 
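Several blueprint bundles above report that they are "waiting for dependencies" expressed as OSGi service filters, for example (objectClass=org.opendaylight.mdsal.binding.api.DataBroker). The sketch below is the programmatic equivalent of that wait using an OSGi ServiceTracker; the DataBroker interface name is taken from the log, while the class and timeout are illustrative only, not code from those bundles.

    // Hypothetical illustration of what "waiting for dependencies" means at the OSGi
    // level: a consumer blocking (with a timeout) until a DataBroker service appears.
    import org.opendaylight.mdsal.binding.api.DataBroker;
    import org.osgi.framework.BundleContext;
    import org.osgi.framework.FrameworkUtil;
    import org.osgi.util.tracker.ServiceTracker;

    public final class DataBrokerWaiter {
        public static DataBroker await(long timeoutMillis) throws InterruptedException {
            BundleContext context =
                FrameworkUtil.getBundle(DataBrokerWaiter.class).getBundleContext();
            ServiceTracker<DataBroker, DataBroker> tracker =
                new ServiceTracker<>(context, DataBroker.class, null);
            tracker.open();
            // Blocks up to timeoutMillis; returns null if no DataBroker was registered in
            // time, which is the situation the "waiting for dependencies" lines report.
            DataBroker broker = tracker.waitForService(timeoutMillis);
            // Keep the tracker open for as long as the service is in use; closing it
            // ungets the service.
            return broker;
        }
    }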
2025-09-06T01:18:56,772 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-1,name='default',path='/',bundle=org.ops4j.pax.web.pax-web-extender-whiteboard,context=(supplier)}", size=1} 2025-09-06T01:18:56,773 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Adding OsgiContextModel{WB,id=OCM-1,name='default',path='/',bundle=org.ops4j.pax.web.pax-web-extender-whiteboard,context=(supplier)} to o.o.p.w.s.j.i.PaxWebServletContextHandler@35f90473{/,null,AVAILABLE} 2025-09-06T01:18:56,775 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Registering ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[{WB,OCM-1,default,/}]} 2025-09-06T01:18:56,776 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[{WB,OCM-1,default,/}]}", size=1} 2025-09-06T01:18:56,776 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Adding servlet ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[{WB,OCM-1,default,/}]} 2025-09-06T01:18:56,797 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | FileAkkaConfigurationReader | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | File-based Pekko configuration reader enabled 2025-09-06T01:18:56,825 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | OSGiActorSystemProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Actor System provider starting 2025-09-06T01:18:56,995 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | ActorSystemProviderImpl | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Creating new ActorSystem 2025-09-06T01:18:57,248 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Slf4jLogger | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Slf4jLogger started 2025-09-06T01:18:57,471 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ArteryTransport | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Remoting started with transport [Artery tcp]; listening on address [pekko://opendaylight-cluster-data@10.30.170.226:2550] with UID [-3578709907272415621] 2025-09-06T01:18:57,480 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Starting up, Pekko version [1.0.3] ... 
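The node's remoting address (pekko://opendaylight-cluster-data@10.30.170.226:2550) comes from the file-based Pekko configuration the log mentions above. A minimal equivalent of those settings, parsed with the Typesafe Config API that Pekko uses, might look like the following sketch; the HOCON keys are assumed to mirror Akka's artery/cluster layout under the pekko prefix, and the member addresses are the ones appearing in this log rather than a copy of the actual configuration file.

    // Sketch of the bootstrap settings behind the remoting address and seed-node
    // joins seen in this log. Keys and values are illustrative assumptions.
    import com.typesafe.config.Config;
    import com.typesafe.config.ConfigFactory;

    public final class ClusterConfigSketch {
        public static void main(String[] args) {
            Config config = ConfigFactory.parseString(
                "pekko.remote.artery.canonical.hostname = \"10.30.170.226\"\n"
              + "pekko.remote.artery.canonical.port = 2550\n"
              + "pekko.cluster.seed-nodes = [\n"
              + "  \"pekko://opendaylight-cluster-data@10.30.170.226:2550\",\n"
              + "  \"pekko://opendaylight-cluster-data@10.30.171.195:2550\",\n"
              + "  \"pekko://opendaylight-cluster-data@10.30.171.248:2550\"\n"
              + "]\n");
            // A starting node keeps sending InitJoin to these seed nodes until one
            // answers, which is what the "Received InitJoin ... but this node is not
            // initialized yet" lines nearby reflect.
            System.out.println(config.getStringList("pekko.cluster.seed-nodes"));
        }
    }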
2025-09-06T01:18:57,527 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Registered cluster JMX MBean [pekko:type=Cluster] 2025-09-06T01:18:57,528 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Started up successfully 2025-09-06T01:18:57,559 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | SBR started. Config: strategy [KeepMajority], stable-after [7 seconds], down-all-when-unstable [5250 milliseconds], selfUniqueAddress [pekko://opendaylight-cluster-data@10.30.170.226:2550#-3578709907272415621], selfDc [default]. 2025-09-06T01:18:57,763 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | OSGiActorSystemProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Actor System provider started 2025-09-06T01:18:57,769 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | FileModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Shard configuration provider started 2025-09-06T01:18:57,787 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Command registration delayed for bundle org.opendaylight.infrautils.diagstatus-shell/7.1.4. Missing service: [org.opendaylight.infrautils.diagstatus.DiagStatusServiceMBean] 2025-09-06T01:18:57,846 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/system/cluster/core/daemon/firstSeedNodeProcess-1#-1645934710]], but this node is not initialized yet 2025-09-06T01:18:57,847 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-2093592327]], but this node is not initialized yet 2025-09-06T01:18:57,910 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | KarafSystemReady | 202 - org.opendaylight.infrautils.ready-impl - 7.1.4 | ThreadFactory created: SystemReadyService 2025-09-06T01:18:57,912 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | KarafSystemReady | 202 - org.opendaylight.infrautils.ready-impl - 7.1.4 | Now starting to provide full system readiness status updates (see TestBundleDiag's logs)... 2025-09-06T01:18:57,913 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | DiagStatusServiceImpl | 199 - org.opendaylight.infrautils.diagstatus-impl - 7.1.4 | Diagnostic Status Service started 2025-09-06T01:18:57,913 | INFO | SystemReadyService-0 | KarafSystemReady | 202 - org.opendaylight.infrautils.ready-impl - 7.1.4 | checkBundleDiagInfos() started... 
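With the JolokiaServlet mounted at /jolokia and the cluster MBean registered as pekko:type=Cluster (both shown in this log), an external probe can read the cluster state over HTTP. The sketch below uses Jolokia's generic read operation via the JDK HTTP client; the port comes from the Jetty connector above, while the admin:admin Basic-auth credentials are a common test-environment default and are an assumption here.

    // Hypothetical external probe of the pekko:type=Cluster MBean through Jolokia.
    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.nio.charset.StandardCharsets;
    import java.util.Base64;

    public final class ClusterMBeanProbe {
        public static void main(String[] args) throws Exception {
            String auth = Base64.getEncoder()
                .encodeToString("admin:admin".getBytes(StandardCharsets.UTF_8));
            HttpRequest request = HttpRequest.newBuilder()
                // Jolokia's "read" operation; with no attribute named it returns every
                // attribute of the MBean as JSON.
                .uri(URI.create("http://127.0.0.1:8181/jolokia/read/pekko:type=Cluster"))
                .header("Authorization", "Basic " + auth)
                .GET()
                .build();
            HttpResponse<String> response =
                HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode());
            System.out.println(response.body());
        }
    }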
2025-09-06T01:18:57,917 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | MBeanUtils | 198 - org.opendaylight.infrautils.diagstatus-api - 7.1.4 | MBean registration for org.opendaylight.infrautils.diagstatus:type=SvcStatus SUCCESSFUL. 2025-09-06T01:18:57,917 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | DiagStatusServiceMBeanImpl | 199 - org.opendaylight.infrautils.diagstatus-impl - 7.1.4 | Diagnostic Status Service management started 2025-09-06T01:18:57,918 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.opendaylight.infrautils.diagstatus-shell/7.1.4 2025-09-06T01:18:57,932 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.openflowplugin.api.openflow.FlowGroupCacheManager), (objectClass=org.opendaylight.openflowplugin.applications.reconciliation.ReconciliationManager), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.openflowplugin.api.openflow.mastership.MastershipChangeServiceManager), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.openflowplugin.applications.frm.recovery.OpenflowServiceRecoveryHandler), (objectClass=org.opendaylight.serviceutils.srm.ServiceRecoveryRegistry), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.RpcService)] 2025-09-06T01:18:57,940 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.RpcService), (objectClass=org.opendaylight.openflowplugin.applications.deviceownershipservice.DeviceOwnershipService), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer))] 2025-09-06T01:18:57,946 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Command registration delayed for bundle org.opendaylight.openflowplugin.applications.reconciliation-framework/0.20.0. 
Missing service: [org.opendaylight.openflowplugin.applications.reconciliation.ReconciliationManager] 2025-09-06T01:18:57,951 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.eos.binding.api.EntityOwnershipService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.binding.api.NotificationService), (objectClass=org.opendaylight.mdsal.binding.api.NotificationPublishService)] 2025-09-06T01:18:57,993 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.impl/0.20.0 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationServiceFactory)] 2025-09-06T01:18:58,000 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.impl/0.20.0 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer))] 2025-09-06T01:18:58,000 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.openflowplugin.api.openflow.FlowGroupCacheManager), (objectClass=org.opendaylight.openflowplugin.applications.reconciliation.ReconciliationManager), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.openflowplugin.applications.frm.recovery.OpenflowServiceRecoveryHandler), (objectClass=org.opendaylight.serviceutils.srm.ServiceRecoveryRegistry), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.RpcService)] 2025-09-06T01:18:58,001 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.openflowplugin.api.openflow.FlowGroupCacheManager), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.openflowplugin.applications.frm.recovery.OpenflowServiceRecoveryHandler), (objectClass=org.opendaylight.serviceutils.srm.ServiceRecoveryRegistry), 
(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.RpcService)] 2025-09-06T01:18:58,005 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | ReconciliationManagerImpl | 302 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.20.0 | ReconciliationManager started 2025-09-06T01:18:58,005 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.opendaylight.openflowplugin.applications.reconciliation-framework/0.20.0 2025-09-06T01:18:58,006 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.openflowplugin.applications.frm.recovery.OpenflowServiceRecoveryHandler), (objectClass=org.opendaylight.serviceutils.srm.ServiceRecoveryRegistry), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.RpcService)] 2025-09-06T01:18:58,010 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | MessageIntelligenceAgencyImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Registered MBean org.opendaylight.openflowplugin.impl.statistics.ofpspecific:type=MessageIntelligenceAgencyMXBean 2025-09-06T01:18:58,012 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.opendaylight.openflowplugin.impl/0.20.0 2025-09-06T01:18:58,032 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.openflowplugin.applications.frm.recovery.OpenflowServiceRecoveryHandler), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.RpcService)] 2025-09-06T01:18:58,033 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.RpcService)] 
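The dependency lists printed by the blueprint containers use standard OSGi (LDAP-style) filter syntax; for example, (&(|(type=default)(!(type=*)))(objectClass=...BindingNormalizedNodeSerializer)) matches a BindingNormalizedNodeSerializer service that either has no type property at all or has type=default. The hypothetical snippet below parses that exact filter with the OSGi framework API and shows which service properties satisfy it.

    // Parsing and evaluating one of the dependency filters printed above.
    import java.util.Hashtable;
    import org.osgi.framework.Filter;
    import org.osgi.framework.FrameworkUtil;
    import org.osgi.framework.InvalidSyntaxException;

    public final class FilterDemo {
        public static void main(String[] args) throws InvalidSyntaxException {
            Filter filter = FrameworkUtil.createFilter(
                "(&(|(type=default)(!(type=*)))"
                + "(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer))");

            Hashtable<String, Object> props = new Hashtable<>();
            props.put("objectClass", new String[] {
                "org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer"});
            System.out.println(filter.match(props));  // true: no "type" property present

            props.put("type", "special");
            System.out.println(filter.match(props));  // false: type present but not "default"
        }
    }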
2025-09-06T01:18:58,035 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | OpenflowServiceRecoveryHandlerImpl | 299 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.20.0 | Registering openflowplugin service recovery handlers 2025-09-06T01:18:58,039 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Command registration delayed for bundle org.opendaylight.openflowplugin.srm-shell/0.20.0. Missing service: [org.opendaylight.mdsal.binding.api.DataBroker, org.opendaylight.serviceutils.srm.spi.RegistryControl] 2025-09-06T01:18:58,043 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | SimpleBindingDOMCodecFactory | 325 - org.opendaylight.yangtools.binding-data-codec-dynamic - 14.0.14 | Binding/DOM Codec enabled 2025-09-06T01:18:58,049 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | OSGiBindingDOMCodec | 326 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.14 | Binding/DOM Codec activating 2025-09-06T01:18:58,050 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | OSGiBindingDOMCodec | 326 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.14 | Binding/DOM Codec activated 2025-09-06T01:18:58,055 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | DefaultBindingRuntimeGenerator | 328 - org.opendaylight.yangtools.binding-generator - 14.0.14 | Binding/YANG type support activated 2025-09-06T01:18:58,062 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | OSGiBindingRuntime | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.14 | Binding Runtime activating 2025-09-06T01:18:58,063 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | OSGiBindingRuntime | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.14 | Binding Runtime activated 2025-09-06T01:18:58,120 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | OSGiModelRuntime | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.14 | Model Runtime starting 2025-09-06T01:18:58,139 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | KarafFeaturesSupport | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.14 | Will attempt to integrate with Karaf FeaturesService 2025-09-06T01:18:58,556 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | NettyTransportSupport | 284 - org.opendaylight.netconf.transport-api - 9.0.0 | Netty transport backed by epoll(2) 2025-09-06T01:18:58,767 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | SharedEffectiveModelContextFactory | 379 - org.opendaylight.yangtools.yang-parser-impl - 14.0.14 | Using weak references 2025-09-06T01:19:00,872 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | OSGiModuleInfoSnapshotImpl | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.14 | EffectiveModelContext generation 1 activated 2025-09-06T01:19:00,873 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | OSGiDOMSchemaService | 251 - org.opendaylight.mdsal.mdsal-dom-schema-osgi - 14.0.13 | DOM Schema services activated 2025-09-06T01:19:00,873 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | OSGiDOMSchemaService | 251 - org.opendaylight.mdsal.mdsal-dom-schema-osgi - 
14.0.13 | Updating context to generation 1 2025-09-06T01:19:00,877 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | DOMRpcRouter | 250 - org.opendaylight.mdsal.mdsal-dom-broker - 14.0.13 | DOM RPC/Action router started 2025-09-06T01:19:00,884 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | OSGiRemoteOpsProvider | 197 - org.opendaylight.controller.sal-remoterpc-connector - 11.0.0 | Remote Operations service starting 2025-09-06T01:19:00,886 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | OSGiRemoteOpsProvider | 197 - org.opendaylight.controller.sal-remoterpc-connector - 11.0.0 | Remote Operations service started 2025-09-06T01:19:01,001 | INFO | opendaylight-cluster-data-pekko.persistence.dispatchers.default-plugin-dispatcher-34 | SegmentedFileJournal | 191 - org.opendaylight.controller.sal-akka-segmented-journal - 11.0.0 | Initialized with root directory segmented-journal with storage MAPPED 2025-09-06T01:19:01,737 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | OSGiBindingRuntimeContextImpl | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.14 | BindingRuntimeContext generation 1 activated 2025-09-06T01:19:01,754 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | OSGiBindingDOMCodecServicesImpl | 326 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.14 | Binding/DOM Codec generation 1 activated 2025-09-06T01:19:01,755 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | GlobalBindingDOMCodecServices | 326 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.14 | Global Binding/DOM Codec activated with generation 1 2025-09-06T01:19:01,760 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | OSGiDatastoreContextIntrospectorFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Datastore Context Introspector activated 2025-09-06T01:19:01,763 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | OSGiDistributedDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Distributed Datastore type CONFIGURATION starting 2025-09-06T01:19:01,976 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | DistributedDataStoreFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Create data store instance of type : config 2025-09-06T01:19:01,977 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | AbstractModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Config file exists - reading config from it 2025-09-06T01:19:01,978 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | AbstractModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Config file exists - reading config from it 2025-09-06T01:19:01,984 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | AbstractDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Creating ShardManager : shardmanager-config 2025-09-06T01:19:02,006 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Starting ShardManager shard-manager-config 2025-09-06T01:19:02,012 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | 
ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Recovery complete 2025-09-06T01:19:02,125 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | DistributedDataStoreFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Data store config is using tell-based protocol 2025-09-06T01:19:02,130 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | AbstractModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Config file exists - reading config from it 2025-09-06T01:19:02,131 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | AbstractModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Config file exists - reading config from it 2025-09-06T01:19:02,136 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | OSGiDistributedDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Distributed Datastore type OPERATIONAL starting 2025-09-06T01:19:02,137 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | DistributedDataStoreFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Create data store instance of type : operational 2025-09-06T01:19:02,137 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | AbstractDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Creating ShardManager : shardmanager-operational 2025-09-06T01:19:02,143 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-topology-config: Shard created, persistent : true 2025-09-06T01:19:02,144 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-default-config: Shard created, persistent : true 2025-09-06T01:19:02,143 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Starting ShardManager shard-manager-operational 2025-09-06T01:19:02,147 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Shard created, persistent : true 2025-09-06T01:19:02,148 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Recovery complete 2025-09-06T01:19:02,149 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-toaster-config: Shard created, persistent : true 2025-09-06T01:19:02,149 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | DistributedDataStoreFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Data store operational is using tell-based protocol 2025-09-06T01:19:02,151 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-default-operational: Shard created, persistent : false 2025-09-06T01:19:02,157 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - 
org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-topology-operational: Shard created, persistent : false 2025-09-06T01:19:02,158 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-operational: Shard created, persistent : false 2025-09-06T01:19:02,158 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-toaster-operational: Shard created, persistent : false 2025-09-06T01:19:02,161 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | OSGiBlockingBindingNormalizer | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.13 | Binding/DOM adapter activated 2025-09-06T01:19:02,169 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.13 | Binding/DOM adapter for MountPointService activated 2025-09-06T01:19:02,173 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | DOMNotificationRouter | 250 - org.opendaylight.mdsal.mdsal-dom-broker - 14.0.13 | DOM Notification Router started 2025-09-06T01:19:02,179 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.eos.binding.api.EntityOwnershipService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.binding.api.NotificationPublishService)] 2025-09-06T01:19:02,180 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.13 | Binding/DOM adapter for NotificationService activated 2025-09-06T01:19:02,183 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-3-shard-default-config/member-3-shard-default-config-notifier#1726764930 created and ready for shard:member-3-shard-default-config 2025-09-06T01:19:02,183 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-3-shard-topology-config/member-3-shard-topology-config-notifier#375639168 created and ready for shard:member-3-shard-topology-config 2025-09-06T01:19:02,184 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-3-shard-topology-operational/member-3-shard-topology-operational-notifier#-1528120096 created and ready for shard:member-3-shard-topology-operational 2025-09-06T01:19:02,184 | INFO | 
opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-3-shard-default-operational/member-3-shard-default-operational-notifier#-606150369 created and ready for shard:member-3-shard-default-operational 2025-09-06T01:19:02,185 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-config: Starting recovery with journal batch size 1 2025-09-06T01:19:02,186 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-operational: Starting recovery with journal batch size 1 2025-09-06T01:19:02,187 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-config: Starting recovery with journal batch size 1 2025-09-06T01:19:02,187 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-3-shard-inventory-config/member-3-shard-inventory-config-notifier#1481201494 created and ready for shard:member-3-shard-inventory-config 2025-09-06T01:19:02,187 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config: Starting recovery with journal batch size 1 2025-09-06T01:19:02,188 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-3-shard-toaster-config/member-3-shard-toaster-config-notifier#1696962738 created and ready for shard:member-3-shard-toaster-config 2025-09-06T01:19:02,188 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-operational: Starting recovery with journal batch size 1 2025-09-06T01:19:02,188 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-3-shard-toaster-operational/member-3-shard-toaster-operational-notifier#1894050320 created and ready for shard:member-3-shard-toaster-operational 2025-09-06T01:19:02,189 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-operational: Starting recovery with journal batch size 1 2025-09-06T01:19:02,189 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-3-shard-inventory-operational/member-3-shard-inventory-operational-notifier#-1270782773 created and ready for shard:member-3-shard-inventory-operational 2025-09-06T01:19:02,189 | INFO | 
opendaylight-cluster-data-shard-dispatcher-39 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-config: Starting recovery with journal batch size 1 2025-09-06T01:19:02,190 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational: Starting recovery with journal batch size 1 2025-09-06T01:19:02,191 | INFO | opendaylight-cluster-data-pekko.persistence.dispatchers.default-plugin-dispatcher-46 | SegmentedFileJournal | 191 - org.opendaylight.controller.sal-akka-segmented-journal - 11.0.0 | Initialized with root directory segmented-journal with storage DISK 2025-09-06T01:19:02,194 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.eos.binding.api.EntityOwnershipService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2025-09-06T01:19:02,196 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.13 | Binding/DOM adapter for NotificationPublishService activated 2025-09-06T01:19:02,198 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer))] 2025-09-06T01:19:02,198 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.applications.deviceownershipservice.DeviceOwnershipService), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer))] 2025-09-06T01:19:02,198 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.13 | Binding/DOM adapter for RpcService activated 2025-09-06T01:19:02,201 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.0 is waiting for dependencies [(objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2025-09-06T01:19:02,207 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | 
AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.13 | Binding/DOM adapter for RpcProviderService activated 2025-09-06T01:19:02,242 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-operational: journal open: applyTo=0 2025-09-06T01:19:02,242 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-config: journal open: applyTo=0 2025-09-06T01:19:02,242 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config: journal open: applyTo=4 2025-09-06T01:19:02,242 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-operational: journal open: applyTo=0 2025-09-06T01:19:02,246 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-config: journal open: applyTo=0 2025-09-06T01:19:02,246 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-operational: journal open: applyTo=0 2025-09-06T01:19:02,247 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-config: journal open: applyTo=75 2025-09-06T01:19:02,249 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational: journal open: applyTo=0 2025-09-06T01:19:02,296 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-config: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-09-06T01:19:02,297 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-config: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-09-06T01:19:02,297 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-09-06T01:19:02,296 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-config: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-09-06T01:19:02,303 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-operational: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, 
snapshot term = -1, journal size = 0 2025-09-06T01:19:02,304 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer))] 2025-09-06T01:19:02,305 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.applications.deviceownershipservice.DeviceOwnershipService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer))] 2025-09-06T01:19:02,306 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-operational: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-09-06T01:19:02,306 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.13 | Binding/DOM adapter for ActionService activated 2025-09-06T01:19:02,307 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-09-06T01:19:02,307 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-default-operational , received role change from null to Follower 2025-09-06T01:19:02,307 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-toaster-operational , received role change from null to Follower 2025-09-06T01:19:02,307 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-topology-operational , received role change from null to Follower 2025-09-06T01:19:02,308 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-operational: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-09-06T01:19:02,308 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.13 | Binding/DOM adapter for ActionProviderService activated 2025-09-06T01:19:02,309 | INFO | Start Level: Equinox Container: 
ff11de6f-1642-4e2e-a4e7-b52d606f885a | DynamicBindingAdapter | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.13 | 8 DOMService trackers started 2025-09-06T01:19:02,311 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-inventory-operational , received role change from null to Follower 2025-09-06T01:19:02,312 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-toaster-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2025-09-06T01:19:02,312 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-default-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2025-09-06T01:19:02,312 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2025-09-06T01:19:02,312 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer))] 2025-09-06T01:19:02,312 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-topology-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2025-09-06T01:19:02,313 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-inventory-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2025-09-06T01:19:02,313 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-toaster-config , received role change from null to Follower 2025-09-06T01:19:02,313 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received role changed for member-3-shard-toaster-operational from null to Follower 2025-09-06T01:19:02,314 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received role changed for member-3-shard-default-operational from null to 
Follower 2025-09-06T01:19:02,314 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received role changed for member-3-shard-topology-operational from null to Follower 2025-09-06T01:19:02,314 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received role changed for member-3-shard-inventory-operational from null to Follower 2025-09-06T01:19:02,314 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-toaster-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2025-09-06T01:19:02,315 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for member-3-shard-toaster-config from null to Follower 2025-09-06T01:19:02,315 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-topology-config , received role change from null to Follower 2025-09-06T01:19:02,315 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-topology-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2025-09-06T01:19:02,316 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for member-3-shard-topology-config from null to Follower 2025-09-06T01:19:02,316 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | GlobalBindingRuntimeContext | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.14 | Global BindingRuntimeContext generation 1 activated 2025-09-06T01:19:02,317 | INFO | Start Level: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | OSGiModelRuntime | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.14 | Model Runtime started 2025-09-06T01:19:02,324 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-inventory-config , received role change from null to Follower 2025-09-06T01:19:02,325 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-inventory-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2025-09-06T01:19:02,325 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for member-3-shard-inventory-config from null to Follower 2025-09-06T01:19:02,338 | INFO | Framework Event Dispatcher: Equinox Container: ff11de6f-1642-4e2e-a4e7-b52d606f885a | Main | 3 - org.ops4j.pax.logging.pax-logging-api - 2.2.8 | Karaf started in 8s. 
Bundle stats: 399 active, 400 total
2025-09-06T01:19:02,375 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-default-config , received role change from null to Follower
2025-09-06T01:19:02,376 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-default-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config
2025-09-06T01:19:02,376 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for member-3-shard-default-config from null to Follower
2025-09-06T01:19:07,277 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | CandidateRegistryInit | 185 - org.opendaylight.controller.eos-dom-akka - 11.0.0 | member-3 : Initial removal of candidates from previous iteration failed. Rescheduling.
java.util.concurrent.TimeoutException: Ask timed out on [Actor[pekko://opendaylight-cluster-data/system/singletonProxyOwnerSupervisor-no-dc#-1542028956]] after [5000 ms]. Message of type [org.opendaylight.controller.eos.akka.owner.supervisor.command.ClearCandidatesForMember]. A typical reason for `AskTimeoutException` is that the recipient actor didn't send a reply.
    at org.apache.pekko.actor.typed.scaladsl.AskPattern$.$anonfun$onTimeout$1(AskPattern.scala:141) ~[bundleFile:?]
    at org.apache.pekko.pattern.PromiseActorRef$.$anonfun$apply$1(AskSupport.scala:737) ~[bundleFile:?]
    at org.apache.pekko.actor.Scheduler$$anon$7.run(Scheduler.scala:491) ~[bundleFile:?]
    at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[bundleFile:?]
    at org.apache.pekko.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(LightArrayRevolverScheduler.scala:384) ~[bundleFile:?]
    at org.apache.pekko.actor.LightArrayRevolverScheduler$$anon$3.executeBucket$1(LightArrayRevolverScheduler.scala:332) ~[bundleFile:?]
    at org.apache.pekko.actor.LightArrayRevolverScheduler$$anon$3.nextTick(LightArrayRevolverScheduler.scala:336) ~[bundleFile:?]
    at org.apache.pekko.actor.LightArrayRevolverScheduler$$anon$3.run(LightArrayRevolverScheduler.scala:288) ~[bundleFile:?]
    at java.lang.Thread.run(Thread.java:1583) ~[?:?]
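The WARN above ends in an AskTimeoutException from the Pekko "ask" pattern: CandidateRegistryInit asked the (not yet reachable) owner supervisor to clear stale candidates, got no reply within 5000 ms, and reschedules the attempt. A minimal sketch of that failure mode follows (illustrative code only, not taken from the OpenDaylight sources; the Request message and actor names are invented) showing that a recipient which never replies makes the returned CompletionStage fail with a TimeoutException:

    import java.time.Duration;
    import java.util.concurrent.CompletionStage;
    import org.apache.pekko.actor.typed.ActorRef;
    import org.apache.pekko.actor.typed.ActorSystem;
    import org.apache.pekko.actor.typed.Behavior;
    import org.apache.pekko.actor.typed.javadsl.AskPattern;
    import org.apache.pekko.actor.typed.javadsl.Behaviors;

    public class AskTimeoutSketch {
        // Hypothetical request carrying a reply-to reference, standing in for
        // ClearCandidatesForMember.
        record Request(ActorRef<String> replyTo) {}

        public static void main(String[] args) {
            // An actor that ignores every request, so the ask below can only time out.
            Behavior<Request> silent = Behaviors.receiveMessage(msg -> Behaviors.same());
            ActorSystem<Request> system = ActorSystem.create(silent, "ask-timeout-sketch");

            CompletionStage<String> reply = AskPattern.ask(
                system,                 // recipient that never answers
                Request::new,           // wraps the replyTo reference into the message
                Duration.ofSeconds(5),  // same 5000 ms budget seen in the WARN above
                system.scheduler());

            reply.whenComplete((ok, failure) -> {
                // failure is a java.util.concurrent.TimeoutException, as in the log.
                System.out.println(ok != null ? ok : failure);
                system.terminate();
            });
        }
    }

As the log itself notes, the attempt is rescheduled; the owner-supervisor singleton is identified a few seconds later (01:19:10,044 below).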
2025-09-06T01:19:09,871 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-2093592327]], but this node is not initialized yet 2025-09-06T01:19:09,927 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoinAck message from [Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/system/cluster/core/daemon#-54335761]] to [pekko://opendaylight-cluster-data@10.30.170.226:2550] 2025-09-06T01:19:09,991 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Welcome from [pekko://opendaylight-cluster-data@10.30.171.195:2550] 2025-09-06T01:19:09,998 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.171.195:2550 2025-09-06T01:19:09,998 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-1-shard-default-config with address pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-config/member-1-shard-default-config 2025-09-06T01:19:09,998 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-1-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-config/member-1-shard-topology-config 2025-09-06T01:19:09,998 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-1-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-config/member-1-shard-inventory-config 2025-09-06T01:19:09,999 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-1-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-config/member-1-shard-toaster-config 2025-09-06T01:19:09,999 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-config: Peer address for peer member-1-shard-default-config set to pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-config/member-1-shard-default-config 2025-09-06T01:19:09,999 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-config: Peer address for peer member-1-shard-toaster-config set to 
pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-config/member-1-shard-toaster-config 2025-09-06T01:19:09,999 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.171.195:2550 2025-09-06T01:19:10,000 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-1-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-operational/member-1-shard-default-operational 2025-09-06T01:19:10,000 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-1-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-operational/member-1-shard-topology-operational 2025-09-06T01:19:09,999 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-config: Peer address for peer member-1-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-config/member-1-shard-topology-config 2025-09-06T01:19:09,999 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config: Peer address for peer member-1-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-config/member-1-shard-inventory-config 2025-09-06T01:19:10,000 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-1-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-operational/member-1-shard-inventory-operational 2025-09-06T01:19:10,000 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-1-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-operational/member-1-shard-toaster-operational 2025-09-06T01:19:10,000 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-operational: Peer address for peer member-1-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-operational/member-1-shard-inventory-operational 2025-09-06T01:19:10,000 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational: Peer address for peer member-1-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-operational/member-1-shard-default-operational 2025-09-06T01:19:10,001 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-operational: Peer address for peer 
member-1-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-operational/member-1-shard-toaster-operational 2025-09-06T01:19:10,000 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-operational: Peer address for peer member-1-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-operational/member-1-shard-topology-operational 2025-09-06T01:19:10,002 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#-1849545225] was unhandled. [1] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T01:19:10,002 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/registry/gossiper#58666379] was unhandled. [2] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T01:19:10,044 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClusterSingletonProxy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Singleton identified at [pekko://opendaylight-cluster-data@10.30.171.195:2550/system/singletonManagerOwnerSupervisor/OwnerSupervisor] 2025-09-06T01:19:10,046 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#-1849545225] was unhandled. [3] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T01:19:10,046 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/registry/gossiper#58666379] was unhandled. [4] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T01:19:10,080 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | EmptyLocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.eos.akka.owner.supervisor.command.ClearCandidatesResponse] to Actor[pekko://opendaylight-cluster-data/temp/$a] was not delivered. [5] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/temp/$a] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
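The repeated dead-letter notices above name the two settings that control this logging, 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. A minimal sketch of how they could be tuned in the actor system's HOCON configuration (the values shown are illustrative assumptions, not taken from this deployment):

    pekko {
      # Log only the first 10 dead letters, then stay quiet.
      log-dead-letters = 10
      # Do not log dead letters produced while the actor system shuts down.
      log-dead-letters-during-shutdown = off
    }

The log's own hint applies: these entries only warrant attention if the target actors terminated unexpectedly rather than simply not listening during cluster join.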
2025-09-06T01:19:10,465 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.170.226:2550 2025-09-06T01:19:10,466 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-3-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-3-shard-default-operational 2025-09-06T01:19:10,467 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-3-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-3-shard-topology-operational 2025-09-06T01:19:10,467 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-3-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-3-shard-inventory-operational 2025-09-06T01:19:10,467 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | This node is now the leader responsible for taking SBR decisions among the reachable nodes (more leaders may exist). 2025-09-06T01:19:10,467 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-3-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-3-shard-toaster-operational 2025-09-06T01:19:10,468 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.170.226:2550 2025-09-06T01:19:10,468 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.171.248:2550 2025-09-06T01:19:10,468 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-3-shard-default-config with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-3-shard-default-config 2025-09-06T01:19:10,469 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-default-operational 2025-09-06T01:19:10,469 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - 
org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-3-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-3-shard-topology-config 2025-09-06T01:19:10,469 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-topology-operational 2025-09-06T01:19:10,469 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-inventory-operational 2025-09-06T01:19:10,469 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational: Peer address for peer member-2-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-default-operational 2025-09-06T01:19:10,470 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-toaster-operational 2025-09-06T01:19:10,470 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-operational: Peer address for peer member-2-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-topology-operational 2025-09-06T01:19:10,469 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-3-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-3-shard-inventory-config 2025-09-06T01:19:10,470 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-operational: Peer address for peer member-2-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-toaster-operational 2025-09-06T01:19:10,470 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-3-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-3-shard-toaster-config 2025-09-06T01:19:10,470 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.171.248:2550 2025-09-06T01:19:10,471 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | 
ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-default-config with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-default-config 2025-09-06T01:19:10,471 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-topology-config 2025-09-06T01:19:10,471 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-config: Peer address for peer member-2-shard-default-config set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-default-config 2025-09-06T01:19:10,471 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-inventory-config 2025-09-06T01:19:10,470 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-operational: Peer address for peer member-2-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-inventory-operational 2025-09-06T01:19:10,471 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-config: Peer address for peer member-2-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-topology-config 2025-09-06T01:19:10,472 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config: Peer address for peer member-2-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-inventory-config 2025-09-06T01:19:10,472 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-toaster-config 2025-09-06T01:19:10,473 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-config: Peer address for peer member-2-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-toaster-config 2025-09-06T01:19:10,476 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | ClusterSingletonManager state change [Start -> Younger] 2025-09-06T01:19:10,788 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node 
[pekko://opendaylight-cluster-data@10.30.170.226:2550] - is the new leader among reachable nodes (more leaders may exist) 2025-09-06T01:19:12,175 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-config (Follower): Term 3 in "RequestVote{term=3, candidateId=member-2-shard-topology-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 2 - updating term 2025-09-06T01:19:12,175 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-operational (Follower): Term 3 in "RequestVote{term=3, candidateId=member-2-shard-toaster-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 2 - updating term 2025-09-06T01:19:12,179 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-operational (Follower): Term 3 in "RequestVote{term=3, candidateId=member-2-shard-inventory-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 2 - updating term 2025-09-06T01:19:12,202 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational (Follower): Term 3 in "RequestVote{term=3, candidateId=member-2-shard-default-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 2 - updating term 2025-09-06T01:19:12,212 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@162efdc1 2025-09-06T01:19:12,213 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@253b1237 2025-09-06T01:19:12,213 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config Received follower initial sync status for member-3-shard-topology-config status sync done false 2025-09-06T01:19:12,214 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational Received follower initial sync status for member-3-shard-toaster-operational status sync done false 2025-09-06T01:19:12,214 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@5c06cc8a 2025-09-06T01:19:12,214 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational Received follower initial sync status for member-3-shard-inventory-operational status sync done false 2025-09-06T01:19:12,215 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - 
org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@173ea073 2025-09-06T01:19:12,216 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational Received follower initial sync status for member-3-shard-default-operational status sync done false 2025-09-06T01:19:12,228 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-operational (Follower): Term 3 in "RequestVote{term=3, candidateId=member-1-shard-topology-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 2 - updating term 2025-09-06T01:19:12,237 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-config (Follower): Term 3 in "RequestVote{term=3, candidateId=member-2-shard-toaster-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 2 - updating term 2025-09-06T01:19:12,241 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config (Follower): Term 3 in "RequestVote{term=3, candidateId=member-2-shard-inventory-config, lastLogIndex=4, lastLogTerm=2}" message is greater than follower's term 2 - updating term 2025-09-06T01:19:12,247 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@2e9f785a 2025-09-06T01:19:12,248 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: All Shards are ready - data store operational is ready 2025-09-06T01:19:12,249 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational Received follower initial sync status for member-3-shard-topology-operational status sync done false 2025-09-06T01:19:12,251 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@3c9c1272 2025-09-06T01:19:12,250 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-config (Follower): Term 3 in "RequestVote{term=3, candidateId=member-2-shard-default-config, lastLogIndex=74, lastLogTerm=2}" message is greater than follower's term 2 - updating term 2025-09-06T01:19:12,252 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@52972d2b 2025-09-06T01:19:12,252 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - 
org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config Received follower initial sync status for member-3-shard-toaster-config status sync done false 2025-09-06T01:19:12,252 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config Received follower initial sync status for member-3-shard-inventory-config status sync done false 2025-09-06T01:19:12,255 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OSGiDOMStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Datastore service type OPERATIONAL activated 2025-09-06T01:19:12,255 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OSGiDistributedDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Distributed Datastore type OPERATIONAL started 2025-09-06T01:19:12,262 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config Received follower initial sync status for member-3-shard-default-config status sync done false 2025-09-06T01:19:12,262 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@52a28263 2025-09-06T01:19:12,262 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: All Shards are ready - data store config is ready 2025-09-06T01:19:12,264 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config Received follower initial sync status for member-3-shard-inventory-config status sync done true 2025-09-06T01:19:12,264 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OSGiDOMStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Datastore service type CONFIGURATION activated 2025-09-06T01:19:12,272 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config Received follower initial sync status for member-3-shard-default-config status sync done true 2025-09-06T01:19:12,280 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OSGiClusterAdmin | 193 - org.opendaylight.controller.sal-cluster-admin-impl - 11.0.0 | Cluster Admin services started 2025-09-06T01:19:12,290 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ConcurrentDOMDataBroker | 358 - org.opendaylight.yangtools.util - 14.0.14 | ThreadFactory created: CommitFutures 2025-09-06T01:19:12,291 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | DataBrokerCommitExecutor | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | DOM Data Broker commit exector started 2025-09-06T01:19:12,293 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ConcurrentDOMDataBroker | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | DOM Data Broker started 2025-09-06T01:19:12,297 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 
1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.0 is waiting for dependencies [(objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))] 2025-09-06T01:19:12,297 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.13 | Binding/DOM adapter for DataBroker activated 2025-09-06T01:19:12,350 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-default-config#-1859237968], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent} 2025-09-06T01:19:12,352 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=0} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-default-config#-1859237968], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} 2025-09-06T01:19:12,357 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.0 is waiting for dependencies [Initial app config AaaCertServiceConfig] 2025-09-06T01:19:12,366 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=0} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-default-config#-1859237968], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} in 15.11 ms 2025-09-06T01:19:12,378 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OSGiPasswordServiceConfigBootstrap | 170 - org.opendaylight.aaa.password-service-impl - 0.21.0 | Listening for password service configuration 2025-09-06T01:19:12,379 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.0 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (objectClass=org.opendaylight.aaa.api.IIDMStore), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), 
(objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth)] 2025-09-06T01:19:12,400 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.0 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config ShiroConfiguration, Initial app config DatastoreConfig, (objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth)] 2025-09-06T01:19:12,402 | ERROR | opendaylight-cluster-data-notification-dispatcher-48 | H2Store | 167 - org.opendaylight.aaa.idm-store-h2 - 0.21.0 | bundle org.opendaylight.aaa.idm-store-h2:0.21.0 (167)[org.opendaylight.aaa.datastore.h2.H2Store(5)] : Constructor argument 0 in class class org.opendaylight.aaa.datastore.h2.H2Store has unsupported type org.opendaylight.aaa.datastore.h2.ConnectionProvider 2025-09-06T01:19:12,403 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | EOSClusterSingletonServiceProvider | 257 - org.opendaylight.mdsal.mdsal-singleton-impl - 14.0.13 | Cluster Singleton Service started 2025-09-06T01:19:12,404 | INFO | opendaylight-cluster-data-notification-dispatcher-48 | DefaultPasswordHashService | 170 - org.opendaylight.aaa.password-service-impl - 0.21.0 | DefaultPasswordHashService will utilize default iteration count=20000 2025-09-06T01:19:12,404 | INFO | opendaylight-cluster-data-notification-dispatcher-48 | DefaultPasswordHashService | 170 - org.opendaylight.aaa.password-service-impl - 0.21.0 | DefaultPasswordHashService will utilize default algorithm=SHA-512 2025-09-06T01:19:12,405 | INFO | opendaylight-cluster-data-notification-dispatcher-48 | DefaultPasswordHashService | 170 - org.opendaylight.aaa.password-service-impl - 0.21.0 | DefaultPasswordHashService will not utilize a private salt, since none was configured 2025-09-06T01:19:12,416 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | YangLibraryWriterSingleton | 291 - org.opendaylight.netconf.yanglib-mdsal-writer - 9.0.0 | ietf-yang-library writer registered 2025-09-06T01:19:12,419 | INFO | opendaylight-cluster-data-notification-dispatcher-48 | H2Store | 167 - org.opendaylight.aaa.idm-store-h2 - 0.21.0 | H2 IDMStore activated 2025-09-06T01:19:12,430 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.0 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config ShiroConfiguration, Initial app config DatastoreConfig, (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth)] 2025-09-06T01:19:12,432 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.0 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config ShiroConfiguration, Initial app config DatastoreConfig] 2025-09-06T01:19:12,454 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.0 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config DatastoreConfig] 2025-09-06T01:19:12,460 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle 
org.opendaylight.aaa.shiro/0.21.0 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager)] 2025-09-06T01:19:12,463 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ArbitratorReconciliationManagerImpl | 296 - org.opendaylight.openflowplugin.applications.arbitratorreconciliation-impl - 0.20.0 | ArbitratorReconciliationManager has started successfully. 2025-09-06T01:19:12,477 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-default-operational#599434262], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent} 2025-09-06T01:19:12,478 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, cookie=0} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-default-operational#599434262], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} 2025-09-06T01:19:12,485 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, cookie=0} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-default-operational#599434262], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} in 6.893 ms 2025-09-06T01:19:12,493 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))] 2025-09-06T01:19:12,512 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.0 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))] 2025-09-06T01:19:12,513 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)] 
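The Follower entries above (for example member-3-shard-topology-config handling RequestVote{term=3, candidateId=member-2-shard-topology-config, lastLogIndex=-1, lastLogTerm=-1} and "updating term") reflect two standard Raft rules: a peer that sees a higher term adopts it, and a vote is granted only if the candidate's log is at least as up to date as the follower's. A minimal, self-contained Java sketch of those two checks follows; all names are illustrative and this is not the sal-akka-raft implementation.

    // Simplified illustration of the Raft rules visible in the RequestVote log lines.
    // Hypothetical types; not OpenDaylight code.
    final class RequestVote {
        final long term;
        final String candidateId;
        final long lastLogIndex;
        final long lastLogTerm;

        RequestVote(long term, String candidateId, long lastLogIndex, long lastLogTerm) {
            this.term = term;
            this.candidateId = candidateId;
            this.lastLogIndex = lastLogIndex;
            this.lastLogTerm = lastLogTerm;
        }
    }

    final class FollowerState {
        long currentTerm = 2;      // matches "follower's term 2" in the log
        String votedFor = null;
        long lastLogIndex = -1;
        long lastLogTerm = -1;

        boolean handleRequestVote(RequestVote rv) {
            if (rv.term > currentTerm) {
                // "Term 3 ... is greater than follower's term 2 - updating term"
                currentTerm = rv.term;
                votedFor = null;
            }
            if (rv.term < currentTerm) {
                return false;      // stale candidate, reject
            }
            // Candidate's log must be at least as up to date as ours.
            boolean logUpToDate = rv.lastLogTerm > lastLogTerm
                    || (rv.lastLogTerm == lastLogTerm && rv.lastLogIndex >= lastLogIndex);
            boolean canVote = votedFor == null || votedFor.equals(rv.candidateId);
            if (logUpToDate && canVote) {
                votedFor = rv.candidateId;
                return true;       // grant the vote
            }
            return false;
        }
    }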
2025-09-06T01:19:12,530 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational Received follower initial sync status for member-3-shard-default-operational status sync done true 2025-09-06T01:19:12,527 | INFO | opendaylight-cluster-data-notification-dispatcher-49 | AAAEncryptionServiceImpl | 165 - org.opendaylight.aaa.encrypt-service-impl - 0.21.0 | AAAEncryptionService activated 2025-09-06T01:19:12,531 | INFO | Blueprint Extender: 2 | AaaCertMdsalProvider | 163 - org.opendaylight.aaa.cert - 0.21.0 | AaaCertMdsalProvider Initialized 2025-09-06T01:19:12,533 | INFO | opendaylight-cluster-data-notification-dispatcher-49 | OSGiEncryptionServiceConfigurator | 165 - org.opendaylight.aaa.encrypt-service-impl - 0.21.0 | Encryption Service enabled 2025-09-06T01:19:12,542 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.0 | DeviceOwnershipService started 2025-09-06T01:19:12,601 | INFO | Blueprint Extender: 3 | LLDPSpeaker | 300 - org.opendaylight.openflowplugin.applications.lldp-speaker - 0.20.0 | LLDPSpeaker started, it will send LLDP frames each 5 seconds 2025-09-06T01:19:12,603 | INFO | Blueprint Extender: 2 | LazyBindingList | 325 - org.opendaylight.yangtools.binding-data-codec-dynamic - 14.0.14 | Using lazy population for lists larger than 16 element(s) 2025-09-06T01:19:12,677 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | DefaultConfigPusher | 301 - org.opendaylight.openflowplugin.applications.of-switch-config-pusher - 0.20.0 | DefaultConfigPusher has started. 2025-09-06T01:19:12,680 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), Initial app config TopologyLldpDiscoveryConfig] 2025-09-06T01:19:12,681 | INFO | Blueprint Extender: 3 | NodeConnectorInventoryEventTranslator | 300 - org.opendaylight.openflowplugin.applications.lldp-speaker - 0.20.0 | NodeConnectorInventoryEventTranslator has started. 
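The ShardManager entries report per-shard "follower initial sync status" flags flipping from false to true as each local shard catches up with its leader, alongside the "All Shards are ready - data store ... is ready" summaries. One way to aggregate per-shard boolean status into an overall ready signal is sketched below; this is a hypothetical helper for illustration, not the actual ShardManager logic.

    import java.util.HashMap;
    import java.util.Map;

    // Tracks which shards of one datastore have reported "sync done true".
    // Illustrative only; not OpenDaylight code.
    final class ShardReadinessTracker {
        private final Map<String, Boolean> syncDone = new HashMap<>();

        ShardReadinessTracker(String... shardNames) {
            for (String name : shardNames) {
                syncDone.put(name, Boolean.FALSE);
            }
        }

        // Called for each "follower initial sync status ... sync done <flag>" event.
        boolean onSyncStatus(String shardName, boolean done) {
            syncDone.put(shardName, done);
            return allReady();
        }

        boolean allReady() {
            return syncDone.values().stream().allMatch(Boolean::booleanValue);
        }
    }

For example, a tracker created with the default, topology, inventory and toaster shards would only report allReady() once all four have flipped to true.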
2025-09-06T01:19:12,683 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.0 has been started 2025-09-06T01:19:12,691 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.0 | Blueprint container for bundle org.opendaylight.openflowplugin.applications.lldp-speaker_0.20.0 [300] was successfully created 2025-09-06T01:19:12,698 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.0 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)] 2025-09-06T01:19:12,700 | INFO | Blueprint Extender: 2 | CertificateManagerService | 163 - org.opendaylight.aaa.cert - 0.21.0 | Certificate Manager service has been initialized 2025-09-06T01:19:12,707 | INFO | Blueprint Extender: 2 | CertificateManagerService | 163 - org.opendaylight.aaa.cert - 0.21.0 | AaaCert Rpc Service has been initialized 2025-09-06T01:19:12,709 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.0 has been started 2025-09-06T01:19:12,716 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.0 | Blueprint container for bundle org.opendaylight.aaa.cert_0.21.0 [163] was successfully created 2025-09-06T01:19:12,717 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | FlowCapableTopologyProvider | 304 - org.opendaylight.openflowplugin.applications.topology-manager - 0.20.0 | Topology Manager service started. 
2025-09-06T01:19:12,731 | INFO | Blueprint Extender: 3 | StoreBuilder | 162 - org.opendaylight.aaa.authn-api - 0.21.0 | Checking if default entries must be created in IDM store 2025-09-06T01:19:12,760 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational Received follower initial sync status for member-3-shard-topology-operational status sync done true 2025-09-06T01:19:12,766 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config Received follower initial sync status for member-3-shard-toaster-config status sync done true 2025-09-06T01:19:12,784 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OSGiSwitchConnectionProviders | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.0 | MD-SAL configuration-based SwitchConnectionProviders started 2025-09-06T01:19:12,786 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Loading properties from '(urn:opendaylight:params:xml:ns:yang:openflow:provider:config?revision=2016-05-10)openflow-provider-config' YANG file 2025-09-06T01:19:12,787 | INFO | opendaylight-cluster-data-notification-dispatcher-48 | OSGiSwitchConnectionProviders | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.0 | Starting instance of type 'openflow-switch-connection-provider-default-impl' 2025-09-06T01:19:12,793 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | rpc-requests-quota configuration property was changed to '20000' 2025-09-06T01:19:12,794 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | global-notification-quota configuration property was changed to '64000' 2025-09-06T01:19:12,794 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | switch-features-mandatory configuration property was changed to 'false' 2025-09-06T01:19:12,794 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | enable-flow-removed-notification configuration property was changed to 'true' 2025-09-06T01:19:12,794 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | is-statistics-rpc-enabled configuration property was changed to 'false' 2025-09-06T01:19:12,794 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | barrier-count-limit configuration property was changed to '25600' 2025-09-06T01:19:12,794 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | barrier-interval-timeout-limit configuration property was changed to '500' 2025-09-06T01:19:12,794 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | echo-reply-timeout configuration property was changed to '2000' 2025-09-06T01:19:12,794 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | is-statistics-polling-on configuration property was changed to 'true' 2025-09-06T01:19:12,795 | INFO | 
Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | is-table-statistics-polling-on configuration property was changed to 'true' 2025-09-06T01:19:12,795 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | is-flow-statistics-polling-on configuration property was changed to 'true' 2025-09-06T01:19:12,795 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | is-group-statistics-polling-on configuration property was changed to 'true' 2025-09-06T01:19:12,795 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | is-meter-statistics-polling-on configuration property was changed to 'true' 2025-09-06T01:19:12,795 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | is-port-statistics-polling-on configuration property was changed to 'true' 2025-09-06T01:19:12,795 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | is-queue-statistics-polling-on configuration property was changed to 'true' 2025-09-06T01:19:12,795 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | skip-table-features configuration property was changed to 'true' 2025-09-06T01:19:12,795 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | basic-timer-delay configuration property was changed to '3000' 2025-09-06T01:19:12,795 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | maximum-timer-delay configuration property was changed to '900000' 2025-09-06T01:19:12,795 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | use-single-layer-serialization configuration property was changed to 'true' 2025-09-06T01:19:12,795 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | thread-pool-min-threads configuration property was changed to '1' 2025-09-06T01:19:12,796 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | thread-pool-max-threads configuration property was changed to '32000' 2025-09-06T01:19:12,796 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | thread-pool-timeout configuration property was changed to '60' 2025-09-06T01:19:12,796 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | device-connection-rate-limit-per-min configuration property was changed to '0' 2025-09-06T01:19:12,796 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | device-connection-hold-time-in-seconds configuration property was changed to '0' 2025-09-06T01:19:12,796 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | device-datastore-removal-delay configuration property was changed to '500' 2025-09-06T01:19:12,796 | INFO | Blueprint Extender: 1 | OSGiConfigurationServiceFactory | 309 - 
org.opendaylight.openflowplugin.impl - 0.20.0 | Loading configuration from 'org.opendaylight.openflowplugin' configuration file 2025-09-06T01:19:12,798 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | felix.fileinstall.filename configuration property was changed to 'file:/tmp/karaf-0.23.0/etc/org.opendaylight.openflowplugin.cfg' 2025-09-06T01:19:12,798 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | service.pid configuration property was changed to 'org.opendaylight.openflowplugin' 2025-09-06T01:19:12,817 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.7 | Registering commands for bundle org.opendaylight.openflowplugin.srm-shell/0.20.0 2025-09-06T01:19:12,860 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | org.opendaylight.openflowplugin.applications.frm.impl.ForwardingRulesManagerImpl@ac74324 was registered as configuration listener to OpenFlowPlugin configuration service 2025-09-06T01:19:12,872 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OSGiFactorySwitchConnectionConfiguration | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.0 | Checking presence of configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-default-impl}] 2025-09-06T01:19:12,877 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OSGiFactorySwitchConnectionConfiguration | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.0 | Checking presence of configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-legacy-impl}] 2025-09-06T01:19:12,881 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OSGiDistributedDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Distributed Datastore type CONFIGURATION started 2025-09-06T01:19:12,885 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OSGiFactorySwitchConnectionConfiguration | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.0 | Configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-default-impl}] already present 2025-09-06T01:19:12,886 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OSGiFactorySwitchConnectionConfiguration | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.0 | Configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-legacy-impl}] already present 
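The ConfigurationServiceFactoryImpl entries above enumerate the effective OpenFlowPlugin settings (rpc-requests-quota, thread-pool-max-threads, skip-table-features, and so on) after loading 'file:/tmp/karaf-0.23.0/etc/org.opendaylight.openflowplugin.cfg'. Karaf .cfg files are plain Java properties, so the same key/value pairs can be inspected with nothing more than java.util.Properties; a small sketch follows, with the file path and keys taken from the log and the fallback defaults set to the logged values.

    import java.io.FileInputStream;
    import java.io.IOException;
    import java.util.Properties;

    // Reads the OpenFlowPlugin Karaf configuration file as plain Java properties.
    public final class ReadOpenflowCfg {
        public static void main(String[] args) throws IOException {
            Properties cfg = new Properties();
            try (FileInputStream in =
                    new FileInputStream("/tmp/karaf-0.23.0/etc/org.opendaylight.openflowplugin.cfg")) {
                cfg.load(in);
            }
            System.out.println("rpc-requests-quota      = " + cfg.getProperty("rpc-requests-quota", "20000"));
            System.out.println("thread-pool-max-threads = " + cfg.getProperty("thread-pool-max-threads", "32000"));
            System.out.println("skip-table-features     = " + cfg.getProperty("skip-table-features", "true"));
        }
    }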
2025-09-06T01:19:13,046 | INFO | Blueprint Extender: 2 | ForwardingRulesManagerImpl | 299 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.20.0 | ForwardingRulesManager has started successfully. 2025-09-06T01:19:13,048 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.0 has been started 2025-09-06T01:19:13,049 | INFO | opendaylight-cluster-data-notification-dispatcher-48 | OSGiSwitchConnectionProviders | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.0 | Starting instance of type 'openflow-switch-connection-provider-legacy-impl' 2025-09-06T01:19:13,053 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.0 | Blueprint container for bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager_0.20.0 [299] was successfully created 2025-09-06T01:19:13,069 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-operational/member-1-shard-topology-operational#1438006928], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent} 2025-09-06T01:19:13,070 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, cookie=1} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-operational/member-1-shard-topology-operational#1438006928], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}} 2025-09-06T01:19:13,070 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, cookie=1} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-operational/member-1-shard-topology-operational#1438006928], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}} in 551.4 μs 2025-09-06T01:19:13,088 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | org.opendaylight.openflowplugin.applications.topology.lldp.LLDPLinkAger@4433f763 was registered as configuration listener to OpenFlowPlugin configuration service 2025-09-06T01:19:13,138 | INFO | Blueprint Extender: 2 | LLDPActivator | 303 - org.opendaylight.openflowplugin.applications.topology-lldp-discovery - 0.20.0 | Starting LLDPActivator with lldpSecureKey: 
aa9251f8-c7c0-4322-b8d6-c3a84593bda3 2025-09-06T01:19:13,142 | INFO | Blueprint Extender: 3 | StoreBuilder | 162 - org.opendaylight.aaa.authn-api - 0.21.0 | Found default domain in IDM store, skipping insertion of default data 2025-09-06T01:19:13,143 | INFO | Blueprint Extender: 3 | AAAShiroProvider | 172 - org.opendaylight.aaa.shiro - 0.21.0 | AAAShiroProvider Session Initiated 2025-09-06T01:19:13,145 | INFO | Blueprint Extender: 2 | LLDPActivator | 303 - org.opendaylight.openflowplugin.applications.topology-lldp-discovery - 0.20.0 | LLDPDiscoveryListener started. 2025-09-06T01:19:13,146 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.0 has been started 2025-09-06T01:19:13,158 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.0 | Blueprint container for bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery_0.20.0 [303] was successfully created 2025-09-06T01:19:13,231 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | FlowCapableTopologyProvider | 304 - org.opendaylight.openflowplugin.applications.topology-manager - 0.20.0 | Topology node flow:1 is successfully written to the operational datastore. 2025-09-06T01:19:13,244 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config Received follower initial sync status for member-3-shard-topology-config status sync done true 2025-09-06T01:19:13,244 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational Received follower initial sync status for member-3-shard-toaster-operational status sync done true 2025-09-06T01:19:13,245 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational Received follower initial sync status for member-3-shard-inventory-operational status sync done true 2025-09-06T01:19:13,264 | INFO | Blueprint Extender: 3 | IniSecurityManagerFactory | 171 - org.opendaylight.aaa.repackaged-shiro - 0.21.0 | Realms have been explicitly set on the SecurityManager instance - auto-setting of realms will not occur. 
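Earlier in the log the LLDPSpeaker announced a fixed emission interval ("it will send LLDP frames each 5 seconds"), and the LLDPActivator / LLDPDiscoveryListener entries above show the discovery side starting up. A fixed-rate task of that kind is commonly driven by a scheduled executor; the generic sketch below only mirrors the 5-second cadence, with a placeholder task body rather than the actual frame-building code.

    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    // Generic fixed-rate scheduler in the spirit of the 5-second LLDP emission interval
    // noted in the log. The task body is a stand-in, not OpenFlowPlugin code.
    public final class PeriodicEmitter {
        public static void main(String[] args) {
            ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
            Runnable emit = () -> System.out.println("emit LLDP frames to all known node connectors");
            // First run immediately, then every 5 seconds; runs until the JVM is stopped.
            scheduler.scheduleAtFixedRate(emit, 0, 5, TimeUnit.SECONDS);
        }
    }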
2025-09-06T01:19:13,270 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: resolved shard 2 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1180830251], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent} 2025-09-06T01:19:13,271 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, cookie=2} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1180830251], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} 2025-09-06T01:19:13,272 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, cookie=2} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1180830251], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} in 731.6 μs 2025-09-06T01:19:13,297 | INFO | paxweb-config-3-thread-1 | ServerModel | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.30 | Created new ServletContextModel{id=ServletContextModel-11,contextPath='/auth'} 2025-09-06T01:19:13,298 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-9,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=298, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=172, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}}", size=2} 2025-09-06T01:19:13,298 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-11,contextPath='/auth'} 2025-09-06T01:19:13,298 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Adding OsgiContextModel{WB,id=OCM-9,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=298, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=172, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@6ea267ba{/auth,null,STOPPED} 2025-09-06T01:19:13,299 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 
394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@6ea267ba{/auth,null,STOPPED} 2025-09-06T01:19:13,301 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Registering FilterModel{id=FilterModel-12,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]} 2025-09-06T01:19:13,302 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of FilterModel{id=FilterModel-12,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}", size=2} 2025-09-06T01:19:13,303 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context /auth 2025-09-06T01:19:13,304 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Starting Jetty context "/auth" with default Osgi Context OsgiContextModel{WB,id=OCM-9,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=298, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=172, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}} 2025-09-06T01:19:13,303 | INFO | Blueprint Extender: 3 | WhiteboardWebServer | 176 - org.opendaylight.aaa.web.osgi-impl - 0.21.0 | Bundle org.opendaylight.aaa.shiro_0.21.0 [172] registered context path /auth with 4 service(s) 2025-09-06T01:19:13,309 | INFO | paxweb-config-3-thread-1 | CustomFilterAdapter | 166 - org.opendaylight.aaa.filterchain - 0.21.0 | Initializing CustomFilterAdapter 2025-09-06T01:19:13,310 | INFO | paxweb-config-3-thread-1 | CustomFilterAdapter | 166 - org.opendaylight.aaa.filterchain - 0.21.0 | Injecting a new filter chain with 0 Filters: 2025-09-06T01:19:13,310 | INFO | paxweb-config-3-thread-1 | ContextHandler | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@6ea267ba{/auth,null,AVAILABLE} 2025-09-06T01:19:13,310 | INFO | paxweb-config-3-thread-1 | OsgiServletContext | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.30 | Registering OsgiServletContext{model=OsgiContextModel{WB,id=OCM-9,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=298, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=172, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}}} as OSGi service for "/auth" context path 2025-09-06T01:19:13,311 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context / 2025-09-06T01:19:13,312 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Registering FilterModel{id=FilterModel-13,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*, /moon/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]} 2025-09-06T01:19:13,312 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of FilterModel{id=FilterModel-13,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*, 
/moon/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}", size=2} 2025-09-06T01:19:13,312 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context /auth 2025-09-06T01:19:13,319 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context / 2025-09-06T01:19:13,319 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Registering ServletModel{id=ServletModel-14,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]} 2025-09-06T01:19:13,319 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of ServletModel{id=ServletModel-14,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}", size=1} 2025-09-06T01:19:13,319 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Adding servlet ServletModel{id=ServletModel-14,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]} 2025-09-06T01:19:13,321 | INFO | Blueprint Extender: 1 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | OpenFlowPluginProvider started, waiting for onSystemBootReady() 2025-09-06T01:19:13,322 | INFO | Blueprint Extender: 1 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Added connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@29d0a242 2025-09-06T01:19:13,322 | INFO | Blueprint Extender: 1 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Added connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@322a9b1d 2025-09-06T01:19:13,353 | ERROR | Blueprint Extender: 3 | MdsalRestconfServer | 279 - org.opendaylight.netconf.restconf-server-mdsal - 9.0.0 | bundle org.opendaylight.netconf.restconf-server-mdsal:9.0.0 (279)[org.opendaylight.restconf.server.mdsal.MdsalRestconfServer(69)] : Constructor argument 5 in class class org.opendaylight.restconf.server.mdsal.MdsalRestconfServer has unsupported type [Lorg.opendaylight.restconf.server.spi.RpcImplementation; 2025-09-06T01:19:13,363 | INFO | Blueprint Extender: 1 | OnfExtensionProvider | 308 - org.opendaylight.openflowplugin.extension-onf - 0.20.0 | ONF Extension Provider started. 
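The Pax Web entries around the /auth context show the OSGi HTTP Whiteboard pattern at work: a ServletContextHelper is registered with osgi.http.whiteboard.context.name and osgi.http.whiteboard.context.path properties (RealmManagement at /auth in the log), and filters and servlets are then published as services that select that context. A hedged sketch of such a servlet registration is below; the property keys follow the whiteboard convention seen in the log, while the servlet itself and its mapping are illustrative, not the AAA or RESTCONF wiring.

    import java.util.Dictionary;
    import java.util.Hashtable;

    import javax.servlet.Servlet;
    import javax.servlet.http.HttpServlet;

    import org.osgi.framework.BundleContext;
    import org.osgi.framework.ServiceRegistration;

    // Sketch of an OSGi HTTP Whiteboard servlet registration of the kind Pax Web is
    // processing above. Placeholder servlet; context name taken from the log.
    public final class WhiteboardExample {
        public static ServiceRegistration<Servlet> register(BundleContext ctx) {
            Dictionary<String, Object> props = new Hashtable<>();
            // Select the shared context registered under the logged name "RealmManagement" ...
            props.put("osgi.http.whiteboard.context.select",
                      "(osgi.http.whiteboard.context.name=RealmManagement)");
            // ... and map the servlet to all paths under that context.
            props.put("osgi.http.whiteboard.servlet.pattern", "/*");

            HttpServlet servlet = new HttpServlet() { };   // placeholder servlet
            return ctx.registerService(Servlet.class, servlet, props);
        }
    }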
2025-09-06T01:19:13,364 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.impl/0.20.0 has been started 2025-09-06T01:19:13,365 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.0 | Blueprint container for bundle org.opendaylight.openflowplugin.impl_0.20.0 [309] was successfully created 2025-09-06T01:19:13,373 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Entity ownership change received for node : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true] 2025-09-06T01:19:13,382 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-09-06T01:19:13,420 | INFO | Blueprint Extender: 3 | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Binding HTTP Service for bundle: [org.opendaylight.netconf.restconf-server-jaxrs_9.0.0 [278]] 2025-09-06T01:19:13,421 | INFO | paxweb-config-3-thread-1 | ServerModel | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.30 | Created new ServletContextModel{id=ServletContextModel-18,contextPath='/rests'} 2025-09-06T01:19:13,421 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-16,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=311, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}}", size=2} 2025-09-06T01:19:13,422 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-18,contextPath='/rests'} 2025-09-06T01:19:13,422 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Adding OsgiContextModel{WB,id=OCM-16,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=311, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@478665a{/rests,null,STOPPED} 2025-09-06T01:19:13,423 | INFO | Blueprint Extender: 3 | WhiteboardWebServer | 176 - org.opendaylight.aaa.web.osgi-impl - 0.21.0 | Bundle org.opendaylight.netconf.restconf-server-jaxrs_9.0.0 [278] registered context path /rests with 4 service(s) 2025-09-06T01:19:13,423 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@478665a{/rests,null,STOPPED} 2025-09-06T01:19:13,423 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Registering FilterModel{id=FilterModel-19,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]} 2025-09-06T01:19:13,424 | INFO | 
paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of FilterModel{id=FilterModel-19,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}", size=2} 2025-09-06T01:19:13,424 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context /auth 2025-09-06T01:19:13,424 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context /rests 2025-09-06T01:19:13,424 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Starting Jetty context "/rests" with default Osgi Context OsgiContextModel{WB,id=OCM-16,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=311, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}} 2025-09-06T01:19:13,426 | INFO | paxweb-config-3-thread-1 | CustomFilterAdapter | 166 - org.opendaylight.aaa.filterchain - 0.21.0 | Initializing CustomFilterAdapter 2025-09-06T01:19:13,426 | INFO | paxweb-config-3-thread-1 | CustomFilterAdapter | 166 - org.opendaylight.aaa.filterchain - 0.21.0 | Injecting a new filter chain with 0 Filters: 2025-09-06T01:19:13,426 | INFO | paxweb-config-3-thread-1 | ContextHandler | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@478665a{/rests,null,AVAILABLE} 2025-09-06T01:19:13,427 | INFO | paxweb-config-3-thread-1 | OsgiServletContext | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.30 | Registering OsgiServletContext{model=OsgiContextModel{WB,id=OCM-16,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=311, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}}} as OSGi service for "/rests" context path 2025-09-06T01:19:13,427 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context / 2025-09-06T01:19:13,427 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Registering FilterModel{id=FilterModel-20,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]} 2025-09-06T01:19:13,428 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of FilterModel{id=FilterModel-20,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}", size=2} 2025-09-06T01:19:13,428 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context /auth 2025-09-06T01:19:13,428 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context /rests 2025-09-06T01:19:13,428 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for 
context / 2025-09-06T01:19:13,428 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Registering ServletModel{id=ServletModel-21,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]} 2025-09-06T01:19:13,428 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of ServletModel{id=ServletModel-21,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}", size=1} 2025-09-06T01:19:13,429 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Adding servlet ServletModel{id=ServletModel-21,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]} 2025-09-06T01:19:13,431 | INFO | paxweb-config-3-thread-1 | ServerModel | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.30 | Created new ServletContextModel{id=ServletContextModel-25,contextPath='/.well-known'} 2025-09-06T01:19:13,431 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-23,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=316, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}}", size=2} 2025-09-06T01:19:13,431 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-25,contextPath='/.well-known'} 2025-09-06T01:19:13,432 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Adding OsgiContextModel{WB,id=OCM-23,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=316, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@6e430a9f{/.well-known,null,STOPPED} 2025-09-06T01:19:13,433 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@6e430a9f{/.well-known,null,STOPPED} 2025-09-06T01:19:13,433 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Registering FilterModel{id=FilterModel-26,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-23,WellKnownURIs,/.well-known}]} 2025-09-06T01:19:13,433 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of FilterModel{id=FilterModel-26,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-23,WellKnownURIs,/.well-known}]}", size=2} 2025-09-06T01:19:13,433 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context /auth 2025-09-06T01:19:13,433 | INFO | Blueprint Extender: 3 | 
WhiteboardWebServer | 176 - org.opendaylight.aaa.web.osgi-impl - 0.21.0 | Bundle org.opendaylight.netconf.restconf-server-jaxrs_9.0.0 [278] registered context path /.well-known with 3 service(s) 2025-09-06T01:19:13,433 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context /rests 2025-09-06T01:19:13,433 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context /.well-known 2025-09-06T01:19:13,434 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Starting Jetty context "/.well-known" with default Osgi Context OsgiContextModel{WB,id=OCM-23,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=316, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}} 2025-09-06T01:19:13,434 | INFO | paxweb-config-3-thread-1 | ContextHandler | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@6e430a9f{/.well-known,null,AVAILABLE} 2025-09-06T01:19:13,434 | INFO | paxweb-config-3-thread-1 | OsgiServletContext | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.30 | Registering OsgiServletContext{model=OsgiContextModel{WB,id=OCM-23,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=316, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}}} as OSGi service for "/.well-known" context path 2025-09-06T01:19:13,434 | INFO | Blueprint Extender: 3 | YangLibraryWriterSingleton | 291 - org.opendaylight.netconf.yanglib-mdsal-writer - 9.0.0 | Binding URL provider org.opendaylight.restconf.server.jaxrs.JaxRsYangLibrary@7c88364c 2025-09-06T01:19:13,435 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Changing filter configuration for context / 2025-09-06T01:19:13,435 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.30 | Registering ServletModel{id=ServletModel-27,name='Rootfound',urlPatterns=[/*],contexts=[{WB,OCM-23,WellKnownURIs,/.well-known}]} 2025-09-06T01:19:13,435 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Receiving Batch{"Registration of ServletModel{id=ServletModel-27,name='Rootfound',urlPatterns=[/*],contexts=[{WB,OCM-23,WellKnownURIs,/.well-known}]}", size=1} 2025-09-06T01:19:13,435 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.30 | Adding servlet ServletModel{id=ServletModel-27,name='Rootfound',urlPatterns=[/*],contexts=[{WB,OCM-23,WellKnownURIs,/.well-known}]} 2025-09-06T01:19:13,468 | INFO | Blueprint Extender: 3 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.14 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv4AddressNoZone 2025-09-06T01:19:13,469 | INFO | Blueprint Extender: 3 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.14 | Instantiated factory for class 
org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv4Prefix 2025-09-06T01:19:13,469 | INFO | Blueprint Extender: 3 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.14 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv6AddressNoZone 2025-09-06T01:19:13,470 | INFO | Blueprint Extender: 3 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.14 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv6Prefix 2025-09-06T01:19:13,495 | INFO | Blueprint Extender: 3 | RestconfTransportChannelListener | 276 - org.opendaylight.netconf.restconf-server - 9.0.0 | Initialized with service class org.opendaylight.restconf.server.mdsal.MdsalRestconfServer 2025-09-06T01:19:13,495 | INFO | Blueprint Extender: 3 | RestconfTransportChannelListener | 276 - org.opendaylight.netconf.restconf-server - 9.0.0 | Initialized with base path: /restconf, default encoding: JSON, default pretty print: false 2025-09-06T01:19:13,531 | INFO | Blueprint Extender: 3 | OSGiNorthbound | 275 - org.opendaylight.netconf.restconf-nb - 9.0.0 | Global RESTCONF northbound pools started 2025-09-06T01:19:13,533 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.0 has been started 2025-09-06T01:19:13,533 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.0 | Blueprint container for bundle org.opendaylight.aaa.shiro_0.21.0 [172] was successfully created 2025-09-06T01:19:14,245 | INFO | SystemReadyService-0 | KarafSystemReady | 202 - org.opendaylight.infrautils.ready-impl - 7.1.4 | checkBundleDiagInfos: Elapsed time 16s, remaining time 283s, diag: Active {INSTALLED=0, RESOLVED=10, UNKNOWN=0, GRACE_PERIOD=0, WAITING=0, STARTING=0, ACTIVE=399, STOPPING=0, FAILURE=0} 2025-09-06T01:19:14,245 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 2025-09-06T01:19:14,245 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.4 | Now notifying all its registered SystemReadyListeners... 
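The entries above show Pax Web bringing up the RESTCONF whiteboard contexts: /rests behind CustomFilterAdapter and AAAShiroFilter with the Jersey ServletContainer, and /.well-known for root-resource discovery. A quick way to confirm both contexts answer once the handlers report AVAILABLE is a plain HTTP probe; the sketch below assumes a local test setup with Jetty on port 8181 and the stock admin/admin credentials, neither of which appears in this log.

```python
#!/usr/bin/env python3
"""Probe the /rests and /.well-known contexts whose registration is logged above.
Host, port and credentials are assumptions for a local test setup, not values
taken from this log."""
import base64
import urllib.error
import urllib.request

BASE = "http://127.0.0.1:8181"                                  # assumed Jetty endpoint
AUTH = "Basic " + base64.b64encode(b"admin:admin").decode()     # assumed test credentials

def probe(path: str) -> None:
    req = urllib.request.Request(BASE + path, headers={"Authorization": AUTH})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            print(path, "->", resp.status)
    except urllib.error.HTTPError as err:
        # Even a 401/403 shows the AAAShiroFilter chain registered above sits in front of the servlet.
        print(path, "->", err.code)

if __name__ == "__main__":
    probe("/rests/data/network-topology:network-topology?content=config")
    probe("/.well-known/host-meta")
```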
2025-09-06T01:19:14,245 | INFO | SystemReadyService-0 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | onSystemBootReady() received, starting the switch connections 2025-09-06T01:19:14,341 | INFO | epollEventLoopGroup-2-1 | TcpServerFacade | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.0 | Switch listener started and ready to accept incoming TCP/TLS connections on /[0:0:0:0:0:0:0:0]:6653 2025-09-06T01:19:14,342 | INFO | epollEventLoopGroup-2-1 | SwitchConnectionProviderImpl | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.0 | Started TCP connection on /[0:0:0:0:0:0:0:0]:6653 2025-09-06T01:19:14,343 | INFO | epollEventLoopGroup-2-1 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@29d0a242 started 2025-09-06T01:19:14,343 | INFO | epollEventLoopGroup-4-1 | TcpServerFacade | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.0 | Switch listener started and ready to accept incoming TCP/TLS connections on /[0:0:0:0:0:0:0:0]:6633 2025-09-06T01:19:14,344 | INFO | epollEventLoopGroup-4-1 | SwitchConnectionProviderImpl | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.0 | Started TCP connection on /[0:0:0:0:0:0:0:0]:6633 2025-09-06T01:19:14,344 | INFO | epollEventLoopGroup-4-1 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@322a9b1d started 2025-09-06T01:19:14,344 | INFO | epollEventLoopGroup-4-1 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | All switchConnectionProviders are up and running (2). 
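The listener entries above report the OpenFlow endpoints on [::]:6653 and [::]:6633 as up before Mininet is pointed at the controller. A minimal reachability check a test harness might run at this point is sketched below; the controller address is an assumption, the ports are the ones named in the log.

```python
#!/usr/bin/env python3
"""Quick TCP reachability check for the OpenFlow listeners reported above.
The controller address is an assumption; 6653/6633 come from the log."""
import socket

CONTROLLER = "127.0.0.1"        # assumed address of this ODL member
OPENFLOW_PORTS = (6653, 6633)

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    # create_connection raises OSError if nothing is listening
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for port in OPENFLOW_PORTS:
        state = "accepting connections" if port_open(CONTROLLER, port) else "closed"
        print(f"{CONTROLLER}:{port} {state}")
```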
2025-09-06T01:19:24,561 | INFO | qtp745298161-397 | AuthenticationManager | 174 - org.opendaylight.aaa.tokenauthrealm - 0.21.0 | Authentication is now enabled 2025-09-06T01:19:24,562 | INFO | qtp745298161-397 | AuthenticationManager | 174 - org.opendaylight.aaa.tokenauthrealm - 0.21.0 | Authentication Manager activated 2025-09-06T01:19:26,340 | INFO | qtp745298161-395 | JaxRsRestconf | 278 - org.opendaylight.netconf.restconf-server-jaxrs - 9.0.0 | RESTCONF data-missing condition is reported as HTTP status 409 (RFC8040) 2025-09-06T01:19:26,342 | INFO | qtp745298161-395 | JaxRsRestconf | 278 - org.opendaylight.netconf.restconf-server-jaxrs - 9.0.0 | RESTCONF data-missing condition is reported as HTTP status 409 (RFC8040) 2025-09-06T01:19:26,564 | INFO | qtp745298161-395 | ApiPathParser | 273 - org.opendaylight.netconf.restconf-api - 9.0.0 | Consecutive slashes in REST URLs will be rejected 2025-09-06T01:19:30,769 | INFO | sshd-SshServer[55ec1453](port=8101)-nio2-thread-2 | ServerSessionImpl | 125 - org.apache.sshd.osgi - 2.14.0 | Session karaf@/10.30.170.65:45864 authenticated 2025-09-06T01:19:31,345 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Data Recovery After Cluster Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Data Recovery After Cluster Restart 2025-09-06T01:25:23,293 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Again Connect To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Again Connect To Follower Node1 2025-09-06T01:25:23,839 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch After Cluster Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch After Cluster Restart 2025-09-06T01:25:24,348 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Follower Node1 2025-09-06T01:25:24,788 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Delete All Flows From Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Delete All Flows From Follower Node1 2025-09-06T01:25:25,199 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify No Flows In Cluster" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single 
Switch.Verify No Flows In Cluster 2025-09-06T01:25:25,638 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Inventory Leader Before Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Inventory Leader Before Leader Restart 2025-09-06T01:25:26,782 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Connect To Leader" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Connect To Leader 2025-09-06T01:25:29,965 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Add Bulk Flow From Leader" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Add Bulk Flow From Leader 2025-09-06T01:25:30,201 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.0 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-09-06T01:25:30,274 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.0 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-09-06T01:25:30,434 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-09-06T01:25:30,457 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Bulk Flows and Verify In Cluster Before Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Bulk Flows and Verify In Cluster Before Leader Restart 2025-09-06T01:25:30,759 | INFO | opendaylight-cluster-data-notification-dispatcher-54 | ConnectionManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Clearing the device connection timer for the device 1 2025-09-06T01:25:30,884 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch Before Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch Before Leader Restart 2025-09-06T01:27:11,666 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Kill Leader From Cluster Node" | core | 112 - org.apache.karaf.log.core - 
4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Kill Leader From Cluster Node 2025-09-06T01:27:11,833 | INFO | pipe-log:log "ROBOT MESSAGE: Killing ODL2 10.30.171.248" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Killing ODL2 10.30.171.248 2025-09-06T01:27:16,490 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Marking node as UNREACHABLE [Member(pekko://opendaylight-cluster-data@10.30.171.248:2550, Up)]. 2025-09-06T01:27:16,491 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Received UnreachableMember: memberName MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.171.248:2550 2025-09-06T01:27:16,491 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Received UnreachableMember: memberName MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.171.248:2550 2025-09-06T01:27:16,493 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-default-operational#599434262], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-default-operational#599434262], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} 2025-09-06T01:27:16,493 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-default-config#-1859237968], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-default-config#-1859237968], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} 2025-09-06T01:27:16,495 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | SBR found unreachable members, waiting for stable-after = 7000 ms before taking downing decision. 
Now 1 unreachable members found. Downing decision will not be made before 2025-09-06T01:27:23.494842333Z. 2025-09-06T01:27:16,496 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: refreshing backend for shard 0 2025-09-06T01:27:16,496 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 0 2025-09-06T01:27:16,497 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1180830251], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1180830251], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} 2025-09-06T01:27:16,497 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: refreshing backend for shard 2 2025-09-06T01:27:20,146 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.remote.rpc.registry.gossip.GossipStatus] from Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#-1849545225] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [6] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T01:27:20,147 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#1832971553] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [7] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
2025-09-06T01:27:20,147 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#1832971553] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [8] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T01:27:20,148 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#1832971553] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [9] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T01:27:20,148 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.remote.rpc.registry.gossip.GossipStatus] from Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#-1849545225] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [10] dead letters encountered, no more dead letters will be logged in next [5.000 min]. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
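The ROBOT steps above ("Add Bulk Flow From Leader", "Get Bulk Flows and Verify In Cluster", "Verify No Flows In Cluster") program flows through bulk-o-matic and then verify the result over RESTCONF, where data-missing is reported as HTTP 409 per the earlier JaxRsRestconf entries. A rough client-side equivalent of the verification step is sketched below; the endpoint, credentials and exact inventory paths are assumptions about a standard openflowplugin deployment, not values taken from this log.

```python
#!/usr/bin/env python3
"""Count the config-datastore flows of table 0 on node openflow:1 via RESTCONF.
Endpoint, credentials and inventory paths are assumptions, not log values."""
import base64
import json
import urllib.error
import urllib.parse
import urllib.request

BASE = "http://127.0.0.1:8181/rests/data"                      # assumed member address
AUTH = {"Authorization": "Basic " + base64.b64encode(b"admin:admin").decode()}
NODE = urllib.parse.quote("openflow:1", safe="")               # RFC 8040 key encoding
TABLE_URL = f"{BASE}/opendaylight-inventory:nodes/node={NODE}/flow-node-inventory:table=0"

def flow_count() -> int:
    req = urllib.request.Request(TABLE_URL + "?content=config", headers=AUTH)
    try:
        with urllib.request.urlopen(req, timeout=15) as resp:
            body = json.load(resp)
    except urllib.error.HTTPError as err:
        if err.code in (404, 409):      # data-missing: no table/flows programmed yet
            return 0
        raise
    table = body.get("flow-node-inventory:table", [{}])[0]
    return len(table.get("flow", []))

if __name__ == "__main__":
    print("config flows in table 0 of openflow:1:", flow_count())
```

Treating 404/409 as "zero flows" mirrors the data-missing mapping the RESTCONF server announces earlier in this log.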
2025-09-06T01:27:20,193 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.248/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-09-06T01:27:22,116 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-config (Follower): Leader pekko://opendaylight-cluster-data@10.30.171.248:2550 is unreachable 2025-09-06T01:27:22,127 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-config (Candidate): Starting new election term 4 2025-09-06T01:27:22,127 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-config (Follower) :- Switching from behavior Follower to Candidate, election term: 4 2025-09-06T01:27:22,128 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@4402b9eb 2025-09-06T01:27:22,128 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-topology-config , received role change from Follower to Candidate 2025-09-06T01:27:22,128 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for member-3-shard-topology-config from Follower to Candidate 2025-09-06T01:27:22,143 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-operational (Follower): Term 4 in "RequestVote{term=4, candidateId=member-1-shard-inventory-operational, lastLogIndex=354, lastLogTerm=3}" message is greater than follower's term 3 - updating term 2025-09-06T01:27:22,151 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-config (Candidate) :- Switching from behavior Candidate to Leader, election term: 4 2025-09-06T01:27:22,152 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-topology-config , received role change from Candidate to Leader 2025-09-06T01:27:22,152 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@452fa275 2025-09-06T01:27:22,152 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for 
member-3-shard-topology-config from Candidate to Leader 2025-09-06T01:27:22,154 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-config (Follower): Leader pekko://opendaylight-cluster-data@10.30.171.248:2550 is unreachable 2025-09-06T01:27:22,157 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@44d909a5 2025-09-06T01:27:22,157 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-config (Candidate): Starting new election term 4 2025-09-06T01:27:22,157 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational Received follower initial sync status for member-3-shard-inventory-operational status sync done false 2025-09-06T01:27:22,157 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-config (Follower) :- Switching from behavior Follower to Candidate, election term: 4 2025-09-06T01:27:22,158 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@7318a1c9 2025-09-06T01:27:22,158 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-toaster-config , received role change from Follower to Candidate 2025-09-06T01:27:22,159 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for member-3-shard-toaster-config from Follower to Candidate 2025-09-06T01:27:22,161 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational Received follower initial sync status for member-3-shard-inventory-operational status sync done true 2025-09-06T01:27:22,161 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-operational (Follower): Term 4 in "RequestVote{term=4, candidateId=member-1-shard-toaster-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 3 - updating term 2025-09-06T01:27:22,164 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config (Follower): Leader pekko://opendaylight-cluster-data@10.30.171.248:2550 is unreachable 2025-09-06T01:27:22,167 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: resolved shard 2 to 
ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-operational/member-1-shard-inventory-operational#2077500482], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent} 2025-09-06T01:27:22,168 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1180830251], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-operational/member-1-shard-inventory-operational#2077500482], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} 2025-09-06T01:27:22,168 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1180830251], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/user/shardmanager-operational/member-1-shard-inventory-operational#2077500482], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} in 598.2 μs 2025-09-06T01:27:22,169 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config (Candidate): Starting new election term 4 2025-09-06T01:27:22,169 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config (Follower) :- Switching from behavior Follower to Candidate, election term: 4 2025-09-06T01:27:22,169 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@7ae69ae6 2025-09-06T01:27:22,169 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-inventory-config , received role change from Follower to Candidate 2025-09-06T01:27:22,170 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - 
org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for member-3-shard-inventory-config from Follower to Candidate 2025-09-06T01:27:22,171 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational Received follower initial sync status for member-3-shard-toaster-operational status sync done false 2025-09-06T01:27:22,171 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@7d4304e8 2025-09-06T01:27:22,172 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-config (Candidate) :- Switching from behavior Candidate to Leader, election term: 4 2025-09-06T01:27:22,172 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-toaster-config , received role change from Candidate to Leader 2025-09-06T01:27:22,173 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@57665082 2025-09-06T01:27:22,173 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for member-3-shard-toaster-config from Candidate to Leader 2025-09-06T01:27:22,193 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational (Follower): Leader pekko://opendaylight-cluster-data@10.30.171.248:2550 is unreachable 2025-09-06T01:27:22,199 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational (Candidate): Starting new election term 4 2025-09-06T01:27:22,199 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 4 2025-09-06T01:27:22,200 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@3cd36994 2025-09-06T01:27:22,200 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-default-operational , received role change from Follower to Candidate 2025-09-06T01:27:22,201 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received role changed for 
member-3-shard-default-operational from Follower to Candidate 2025-09-06T01:27:22,304 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-config (Follower): Leader pekko://opendaylight-cluster-data@10.30.171.248:2550 is unreachable 2025-09-06T01:27:22,308 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-config (Candidate): Starting new election term 4 2025-09-06T01:27:22,308 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-config (Follower) :- Switching from behavior Follower to Candidate, election term: 4 2025-09-06T01:27:22,309 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@7beda610 2025-09-06T01:27:22,309 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-default-config , received role change from Follower to Candidate 2025-09-06T01:27:22,310 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for member-3-shard-default-config from Follower to Candidate 2025-09-06T01:27:22,318 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-config (Candidate) :- Switching from behavior Candidate to Leader, election term: 4 2025-09-06T01:27:22,319 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@35123221 2025-09-06T01:27:22,321 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-default-config , received role change from Candidate to Leader 2025-09-06T01:27:22,322 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for member-3-shard-default-config from Candidate to Leader 2025-09-06T01:27:22,324 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-default-config#1742182425], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present} 2025-09-06T01:27:22,324 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection 
ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-default-config#-1859237968], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-default-config#1742182425], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}} 2025-09-06T01:27:22,328 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-default-config#-1859237968], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-default-config#1742182425], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}} in 2.994 ms 2025-09-06T01:27:22,685 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational Received follower initial sync status for member-3-shard-toaster-operational status sync done true 2025-09-06T01:27:23,691 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | SBR took decision DownUnreachable and is downing [pekko://opendaylight-cluster-data@10.30.171.248:2550], [1] unreachable of [3] members, all members in DC [Member(pekko://opendaylight-cluster-data@10.30.170.226:2550, Up), Member(pekko://opendaylight-cluster-data@10.30.171.195:2550, Up), Member(pekko://opendaylight-cluster-data@10.30.171.248:2550, Up)], full reachability status: [pekko://opendaylight-cluster-data@10.30.170.226:2550 -> pekko://opendaylight-cluster-data@10.30.171.248:2550: Unreachable [Unreachable] (1), pekko://opendaylight-cluster-data@10.30.171.195:2550 -> pekko://opendaylight-cluster-data@10.30.171.248:2550: Unreachable [Unreachable] (1)] 2025-09-06T01:27:23,692 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | SBR is downing [UniqueAddress(pekko://opendaylight-cluster-data@10.30.171.248:2550,-4150491613831459604)] 2025-09-06T01:27:23,692 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Marking unreachable node [pekko://opendaylight-cluster-data@10.30.171.248:2550] as [Down] 2025-09-06T01:27:23,694 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | 
SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | SBR found unreachable members changed during stable-after period. Resetting timer. Now 1 unreachable members found. Downing decision will not be made before 2025-09-06T01:27:30.694227657Z. 2025-09-06T01:27:23,860 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received MemberRemoved: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.171.248:2550 2025-09-06T01:27:23,860 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received MemberRemoved: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.171.248:2550 2025-09-06T01:27:23,859 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Leader is removing unreachable node [pekko://opendaylight-cluster-data@10.30.171.248:2550] 2025-09-06T01:27:23,862 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Association | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Association to [pekko://opendaylight-cluster-data@10.30.171.248:2550] with UID [-4150491613831459604] is irrecoverably failed. UID is now quarantined and all messages to this UID will be delivered to dead letters. Remote ActorSystem must be restarted to recover from this situation. Reason: Cluster member removed, previous status [Down] 2025-09-06T01:27:25,779 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Leader Node" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Leader Node 2025-09-06T01:27:27,875 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.248/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-09-06T01:27:28,346 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Restart Leader from Cluster Node" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Restart Leader from Cluster Node 2025-09-06T01:27:28,495 | INFO | pipe-log:log "ROBOT MESSAGE: Starting ODL2 10.30.171.248" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting ODL2 10.30.171.248 2025-09-06T01:27:30,873 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Member removed [pekko://opendaylight-cluster-data@10.30.171.248:2550] 2025-09-06T01:27:32,237 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Candidate | 190 - 
org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config (Candidate): Starting new election term 5 2025-09-06T01:27:32,286 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational (Candidate): Starting new election term 5 2025-09-06T01:27:35,173 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-584066242]] to [pekko://opendaylight-cluster-data@10.30.170.226:2550] 2025-09-06T01:27:35,174 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Sending InitJoinAck message from node [pekko://opendaylight-cluster-data@10.30.170.226:2550] to [Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-584066242]] (version [1.0.3]) 2025-09-06T01:27:36,097 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Leader is moving node [pekko://opendaylight-cluster-data@10.30.171.248:2550] to [Up] 2025-09-06T01:27:36,097 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.171.248:2550 2025-09-06T01:27:36,098 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-default-operational 2025-09-06T01:27:36,098 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-topology-operational 2025-09-06T01:27:36,098 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.171.248:2550 2025-09-06T01:27:36,098 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational: Peer address for peer member-2-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-default-operational 2025-09-06T01:27:36,098 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-operational: Peer address for peer 
member-2-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-topology-operational 2025-09-06T01:27:36,098 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-default-config with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-default-config 2025-09-06T01:27:36,098 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-inventory-operational 2025-09-06T01:27:36,098 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-toaster-operational 2025-09-06T01:27:36,098 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-topology-config 2025-09-06T01:27:36,098 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-operational: Peer address for peer member-2-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-inventory-operational 2025-09-06T01:27:36,099 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-inventory-config 2025-09-06T01:27:36,099 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-operational: Peer address for peer member-2-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-toaster-operational 2025-09-06T01:27:36,099 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-config: Peer address for peer member-2-shard-default-config set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-default-config 2025-09-06T01:27:36,099 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-toaster-config 2025-09-06T01:27:36,099 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | 
member-3-shard-inventory-config: Peer address for peer member-2-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-inventory-config 2025-09-06T01:27:36,099 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-config: Peer address for peer member-2-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-topology-config 2025-09-06T01:27:36,099 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-config: Peer address for peer member-2-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-toaster-config 2025-09-06T01:27:36,515 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-3-shard-default-operational currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:628) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:242) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
    ...
5 more 2025-09-06T01:27:38,914 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.0 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-09-06T01:27:38,915 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-09-06T01:27:39,439 | INFO | node-cleaner-0 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Try to remove device openflow:1 from operational DS 2025-09-06T01:27:42,288 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config (Candidate): Starting new election term 6 2025-09-06T01:27:42,300 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config (Candidate) :- Switching from behavior Candidate to Leader, election term: 6 2025-09-06T01:27:42,301 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@153c68bd 2025-09-06T01:27:42,303 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-inventory-config , received role change from Candidate to Leader 2025-09-06T01:27:42,304 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for member-3-shard-inventory-config from Candidate to Leader 2025-09-06T01:27:42,304 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: All Shards are ready - data store config is ready 2025-09-06T01:27:42,348 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational (Candidate): Starting new election term 6 2025-09-06T01:27:42,357 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational (Candidate) :- Switching from behavior Candidate to Leader, election term: 6 2025-09-06T01:27:42,357 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@1556d460 2025-09-06T01:27:42,358 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-default-operational , received role change from Candidate to Leader 
2025-09-06T01:27:42,358 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received role changed for member-3-shard-default-operational from Candidate to Leader 2025-09-06T01:27:42,358 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: All Shards are ready - data store operational is ready 2025-09-06T01:27:42,359 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-3-shard-default-operational#-1082326140], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present} 2025-09-06T01:27:42,374 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-default-operational#599434262], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-3-shard-default-operational#-1082326140], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}} 2025-09-06T01:27:42,375 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-operational: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-default-operational#599434262], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-3-shard-default-operational#-1082326140], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}} in 934.7 μs 2025-09-06T01:27:42,390 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=6, success=false, followerId=member-2-shard-default-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 39, snapshotTerm: 
3, replicatedToAllIndex: -1 2025-09-06T01:27:42,391 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational (Leader): follower member-2-shard-default-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2025-09-06T01:27:42,391 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational (Leader): Initiating install snapshot to follower member-2-shard-default-operational: follower nextIndex: 0, leader snapshotIndex: 39, leader lastIndex: 40, leader log size: 1 2025-09-06T01:27:42,403 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=40, lastAppliedTerm=3, lastIndex=40, lastTerm=3, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=0, mandatoryTrim=false] to install on member-2-shard-default-operational 2025-09-06T01:27:42,426 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational: Persising snapshot at EntryInfo[index=40, term=3]/EntryInfo[index=40, term=3] 2025-09-06T01:27:42,428 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational: Removed in-memory snapshotted entries, adjusted snapshotIndex: 39 and term: 3 2025-09-06T01:27:42,442 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational: snapshot is durable as of 2025-09-06T01:27:42.427791991Z 2025-09-06T01:27:42,476 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-default-config: retiring state Enabled{clientId=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=2}, nanosAgo=20155373324, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=3} 2025-09-06T01:27:42,477 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational (Leader): Snapshot successfully installed on follower member-2-shard-default-operational (last chunk 1) - matchIndex set to 40, nextIndex set to 41 2025-09-06T01:27:42,638 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-default-operational: retiring state Enabled{clientId=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, nanosAgo=280284583, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=2} 2025-09-06T01:27:43,585 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-default-operational: Store Tx 
member-2-datastore-operational-fe-2-txn-3-0: Conflicting modification for path /(urn:ietf:params:xml:ns:yang:ietf-subscribed-notifications?revision=2019-09-09)streams/stream/stream[{(urn:ietf:params:xml:ns:yang:ietf-subscribed-notifications?revision=2019-09-09)name=NETCONF}]. 2025-09-06T01:27:52,488 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Data Recovery After Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Data Recovery After Leader Restart 2025-09-06T01:28:25,357 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | PhiAccrualFailureDetector | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | heartbeat interval is growing too large for address pekko://opendaylight-cluster-data@10.30.171.195:2550: 2137 millis 2025-09-06T01:33:44,304 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Again Connect To Leader" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Again Connect To Leader 2025-09-06T01:33:47,521 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch After Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch After Leader Restart 2025-09-06T01:33:47,785 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.0 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-09-06T01:33:47,904 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-09-06T01:33:47,966 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: retiring state Enabled{clientId=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=2}, nanosAgo=365662524346, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=3} 2025-09-06T01:33:48,216 | INFO | opendaylight-cluster-data-notification-dispatcher-64 | ConnectionManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Clearing the device connection timer for the device 1 2025-09-06T01:35:29,982 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Leader Node After Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single 
Switch.Stop Mininet Connected To Leader Node After Leader Restart 2025-09-06T01:35:30,505 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.0 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-09-06T01:35:30,505 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-09-06T01:35:31,012 | INFO | node-cleaner-0 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Try to remove device openflow:1 from operational DS 2025-09-06T01:35:32,768 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Delete All Flows From Leader Node" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Delete All Flows From Leader Node 2025-09-06T01:35:33,184 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify No Flows In Cluster After Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify No Flows In Cluster After Leader Restart 2025-09-06T01:35:33,626 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Inventory Follower Before follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Inventory Follower Before follower Restart 2025-09-06T01:35:34,777 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Connect To Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Connect To Follower Node2 2025-09-06T01:35:37,582 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Add Bulk Flow From Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Add Bulk Flow From Follower Node2 2025-09-06T01:35:37,954 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.0 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-09-06T01:35:38,038 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Bulk Flows and Verify In Cluster Before Follower Restart" | core | 112 - 
org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Bulk Flows and Verify In Cluster Before Follower Restart 2025-09-06T01:35:38,194 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-09-06T01:35:38,219 | INFO | opendaylight-cluster-data-notification-dispatcher-65 | ConnectionManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Clearing the device connection timer for the device 1 2025-09-06T01:35:38,462 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch Before Follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch Before Follower Restart 2025-09-06T01:37:20,779 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Kill Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Kill Follower Node2 2025-09-06T01:37:20,937 | INFO | pipe-log:log "ROBOT MESSAGE: Killing ODL2 10.30.171.248" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Killing ODL2 10.30.171.248 2025-09-06T01:37:25,745 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Marking node as UNREACHABLE [Member(pekko://opendaylight-cluster-data@10.30.171.248:2550, Up)]. 2025-09-06T01:37:25,746 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Received UnreachableMember: memberName MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.171.248:2550 2025-09-06T01:37:25,745 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Received UnreachableMember: memberName MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.171.248:2550 2025-09-06T01:37:25,747 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | SBR found unreachable members, waiting for stable-after = 7000 ms before taking downing decision. Now 1 unreachable members found. Downing decision will not be made before 2025-09-06T01:37:32.745947827Z. 
2025-09-06T01:37:29,132 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.248/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-09-06T01:37:29,134 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#1832971553] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [107] dead letters encountered, of which 96 were not logged. The counter will be reset now. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T01:37:29,135 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-default-config#1742182425] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [1] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T01:37:29,135 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [2] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T01:37:29,136 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#1832971553] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [3] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
2025-09-06T01:37:29,136 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/clusterReceptionist/replicator#-611827098] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [4] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T01:37:29,136 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-3-shard-default-operational#-1082326140] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [5] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T01:37:29,136 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-toaster-config#972328392] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [6] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T01:37:29,137 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-topology-config#-1103323626] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [7] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T01:37:29,137 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-default-config#1742182425] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. 
[8] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T01:37:29,137 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.remote.rpc.registry.gossip.GossipStatus] from Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#-1849545225] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [9] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T01:37:29,138 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.remote.rpc.registry.gossip.GossipStatus] from Actor[pekko://opendaylight-cluster-data/user/rpc/registry/gossiper#58666379] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [10] dead letters encountered, no more dead letters will be logged in next [5.000 min]. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T01:37:32,825 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | SBR took decision DownUnreachable and is downing [pekko://opendaylight-cluster-data@10.30.171.248:2550], [1] unreachable of [3] members, all members in DC [Member(pekko://opendaylight-cluster-data@10.30.170.226:2550, Up), Member(pekko://opendaylight-cluster-data@10.30.171.195:2550, Up), Member(pekko://opendaylight-cluster-data@10.30.171.248:2550, Up)], full reachability status: [pekko://opendaylight-cluster-data@10.30.170.226:2550 -> pekko://opendaylight-cluster-data@10.30.171.248:2550: Unreachable [Unreachable] (2), pekko://opendaylight-cluster-data@10.30.171.195:2550 -> pekko://opendaylight-cluster-data@10.30.171.248:2550: Unreachable [Unreachable] (2)] 2025-09-06T01:37:32,826 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | SBR is downing [UniqueAddress(pekko://opendaylight-cluster-data@10.30.171.248:2550,-7148714470107626924)] 2025-09-06T01:37:32,826 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Marking unreachable node [pekko://opendaylight-cluster-data@10.30.171.248:2550] as [Down] 2025-09-06T01:37:32,827 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | SBR found unreachable members changed during stable-after period. 
Resetting timer. Now 1 unreachable members found. Downing decision will not be made before 2025-09-06T01:37:39.826706647Z. 2025-09-06T01:37:34,265 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Leader is removing unreachable node [pekko://opendaylight-cluster-data@10.30.171.248:2550] 2025-09-06T01:37:34,266 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received MemberRemoved: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.171.248:2550 2025-09-06T01:37:34,266 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received MemberRemoved: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.171.248:2550 2025-09-06T01:37:34,267 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Association | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Association to [pekko://opendaylight-cluster-data@10.30.171.248:2550] with UID [-7148714470107626924] is irrecoverably failed. UID is now quarantined and all messages to this UID will be delivered to dead letters. Remote ActorSystem must be restarted to recover from this situation. Reason: Cluster member removed, previous status [Down] 2025-09-06T01:37:35,021 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Follower Node2 and Exit" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Follower Node2 and Exit 2025-09-06T01:37:36,469 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.248/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-09-06T01:37:37,619 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Restart Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Restart Follower Node2 2025-09-06T01:37:37,756 | INFO | pipe-log:log "ROBOT MESSAGE: Starting ODL2 10.30.171.248" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting ODL2 10.30.171.248 2025-09-06T01:37:37,961 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.248/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-09-06T01:37:39,002 | WARN | 
opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.248/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-09-06T01:37:41,284 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Member removed [pekko://opendaylight-cluster-data@10.30.171.248:2550] 2025-09-06T01:37:44,123 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-478685345]] to [pekko://opendaylight-cluster-data@10.30.170.226:2550] 2025-09-06T01:37:44,123 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Sending InitJoinAck message from node [pekko://opendaylight-cluster-data@10.30.170.226:2550] to [Actor[pekko://opendaylight-cluster-data@10.30.171.248:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-478685345]] (version [1.0.3]) 2025-09-06T01:37:45,485 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Leader is moving node [pekko://opendaylight-cluster-data@10.30.171.248:2550] to [Up] 2025-09-06T01:37:45,485 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.171.248:2550 2025-09-06T01:37:45,486 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-default-operational 2025-09-06T01:37:45,486 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.171.248:2550 2025-09-06T01:37:45,486 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-topology-operational 2025-09-06T01:37:45,487 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-inventory-operational with address 
pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-inventory-operational 2025-09-06T01:37:45,487 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-default-config with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-default-config 2025-09-06T01:37:45,487 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-toaster-operational 2025-09-06T01:37:45,487 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational: Peer address for peer member-2-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-default-operational 2025-09-06T01:37:45,487 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-topology-config 2025-09-06T01:37:45,487 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-operational: Peer address for peer member-2-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-inventory-operational 2025-09-06T01:37:45,487 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: All Shards are ready - data store operational is ready 2025-09-06T01:37:45,487 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-operational: Peer address for peer member-2-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-topology-operational 2025-09-06T01:37:45,487 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-inventory-config 2025-09-06T01:37:45,487 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-operational: Peer address for peer member-2-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-operational/member-2-shard-toaster-operational 2025-09-06T01:37:45,487 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-config: Peer address for peer member-2-shard-topology-config set to 
pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-topology-config 2025-09-06T01:37:45,488 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | updatePeerAddress for peer member-2-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-toaster-config 2025-09-06T01:37:45,488 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config: Peer address for peer member-2-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-inventory-config 2025-09-06T01:37:45,488 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: All Shards are ready - data store config is ready 2025-09-06T01:37:45,488 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-config: Peer address for peer member-2-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-toaster-config 2025-09-06T01:37:45,488 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-config: Peer address for peer member-2-shard-default-config set to pekko://opendaylight-cluster-data@10.30.171.248:2550/user/shardmanager-config/member-2-shard-default-config 2025-09-06T01:37:48,774 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-09-06T01:37:48,775 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.0 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-09-06T01:37:48,782 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=6, success=true, followerId=member-2-shard-inventory-config, logLastIndex=16, logLastTerm=6, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 27626, lastApplied : 16, commitIndex : 16 2025-09-06T01:37:48,861 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=6, success=false, followerId=member-2-shard-default-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 27536, lastApplied : 56, commitIndex : 56 
2025-09-06T01:37:48,862 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=6, success=false, followerId=member-2-shard-default-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 55, snapshotTerm: 6, replicatedToAllIndex: 55 2025-09-06T01:37:48,862 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational (Leader): follower member-2-shard-default-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2025-09-06T01:37:48,862 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational (Leader): Initiating install snapshot to follower member-2-shard-default-operational: follower nextIndex: 0, leader snapshotIndex: 55, leader lastIndex: 56, leader log size: 1 2025-09-06T01:37:48,863 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=56, lastAppliedTerm=6, lastIndex=56, lastTerm=6, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=0, mandatoryTrim=false] to install on member-2-shard-default-operational 2025-09-06T01:37:48,869 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational: Persising snapshot at EntryInfo[index=56, term=6]/EntryInfo[index=56, term=6] 2025-09-06T01:37:48,869 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational: Removed in-memory snapshotted entries, adjusted snapshotIndex: 55 and term: 6 2025-09-06T01:37:48,877 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational: snapshot is durable as of 2025-09-06T01:37:48.869817156Z 2025-09-06T01:37:48,904 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-config (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=4, success=true, followerId=member-2-shard-topology-config, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 27568, lastApplied : -1, commitIndex : -1 2025-09-06T01:37:48,904 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=6, success=false, followerId=member-2-shard-default-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, 
payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 55, snapshotTerm: 6, replicatedToAllIndex: 55 2025-09-06T01:37:48,904 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-config (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=4, success=true, followerId=member-2-shard-toaster-config, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 27568, lastApplied : -1, commitIndex : -1 2025-09-06T01:37:48,905 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational (Leader): follower member-2-shard-default-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2025-09-06T01:37:48,951 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-config (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=4, success=true, followerId=member-2-shard-default-config, logLastIndex=162, logLastTerm=4, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 27555, lastApplied : 162, commitIndex : 162 2025-09-06T01:37:48,971 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational (Leader): Snapshot successfully installed on follower member-2-shard-default-operational (last chunk 1) - matchIndex set to 56, nextIndex set to 57 2025-09-06T01:37:49,176 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-default-config: retiring state Enabled{clientId=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=3}, nanosAgo=606064174542, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=4} 2025-09-06T01:37:49,280 | INFO | node-cleaner-1 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Try to remove device openflow:1 from operational DS 2025-09-06T01:37:49,364 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-default-operational: retiring state Enabled{clientId=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=2}, nanosAgo=135105257394, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=3} 2025-09-06T01:37:50,177 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-default-operational: Store Tx member-2-datastore-operational-fe-3-txn-3-0: Conflicting modification for path 
/(urn:ietf:params:xml:ns:yang:ietf-subscribed-notifications?revision=2019-09-09)streams/stream/stream[{(urn:ietf:params:xml:ns:yang:ietf-subscribed-notifications?revision=2019-09-09)name=NETCONF}]. 2025-09-06T01:38:01,894 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Data Recovery After Follower Node2 Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Data Recovery After Follower Node2 Restart 2025-09-06T01:43:53,764 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Again Connect To Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Again Connect To Follower Node2 2025-09-06T01:43:57,030 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch After Follower Node2 Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch After Follower Node2 Restart 2025-09-06T01:43:57,284 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.0 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-09-06T01:43:57,395 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: retiring state Enabled{clientId=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=3}, nanosAgo=499097519428, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=4} 2025-09-06T01:43:57,404 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-09-06T01:43:57,660 | INFO | opendaylight-cluster-data-notification-dispatcher-75 | ConnectionManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Clearing the device connection timer for the device 1 2025-09-06T01:45:39,422 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Follower Node2 2025-09-06T01:45:39,864 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.0 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER 
[wasOwner=false, isOwner=false, hasOwner=false] 2025-09-06T01:45:39,864 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-09-06T01:45:40,371 | INFO | node-cleaner-0 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.0 | Try to remove device openflow:1 from operational DS 2025-09-06T01:45:42,163 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Delete All Flows From Follower Node 2" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Delete All Flows From Follower Node 2 2025-09-06T01:45:42,621 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify No Flows In Cluster After Follower Node2 Restart" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify No Flows In Cluster After Follower Node2 Restart 2025-09-06T01:45:45,418 | INFO | sshd-SshServer[55ec1453](port=8101)-nio2-thread-2 | ServerSessionImpl | 125 - org.apache.sshd.osgi - 2.14.0 | Session karaf@/10.30.170.65:50130 authenticated 2025-09-06T01:45:46,066 | INFO | pipe-log:log "ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-titanium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/040__Cluster_Current_Term_Verification_3Node_Cluster.robot" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-titanium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/040__Cluster_Current_Term_Verification_3Node_Cluster.robot 2025-09-06T01:45:46,460 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Check Shard And Get Inventory" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Check Shard And Get Inventory 2025-09-06T01:45:51,535 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Initial Current Term Verification" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Initial Current Term Verification 2025-09-06T01:45:51,947 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Add Bulk Flow From Follower" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Add Bulk Flow From Follower 2025-09-06T01:45:52,150 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: retiring state 
Enabled{clientId=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, nanosAgo=1089846780112, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1} 2025-09-06T01:45:53,158 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.lang.UnsupportedOperationException: null at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
2025-09-06T01:45:53,166 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | null 2025-09-06T01:46:08,384 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15230 ms in state COMMIT_PENDING 2025-09-06T01:46:08,387 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:46:23,443 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING 2025-09-06T01:46:23,444 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:46:38,484 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15040 ms in state COMMIT_PENDING 2025-09-06T01:46:38,484 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:47:08,603 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 16406 ms in state COMMIT_PENDING 2025-09-06T01:47:08,605 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:47:23,664 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING 2025-09-06T01:47:23,665 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:47:38,714 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state 
COMMIT_PENDING 2025-09-06T01:47:38,714 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:48:08,824 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 16586 ms in state COMMIT_PENDING 2025-09-06T01:48:08,825 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:48:22,271 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:48:22,273 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:48:23,883 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING 2025-09-06T01:48:23,884 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:48:38,934 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15050 ms in state COMMIT_PENDING 2025-09-06T01:48:38,935 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:48:52,294 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:48:52,296 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:49:09,023 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 16733 ms in state COMMIT_PENDING 2025-09-06T01:49:09,024 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:49:22,314 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:49:22,316 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:49:39,143 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 16832 ms in state COMMIT_PENDING 2025-09-06T01:49:39,144 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:49:52,338 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] 
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:49:52,340 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:50:09,256 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 16920 ms in state COMMIT_PENDING 2025-09-06T01:50:09,261 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:50:39,354 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 17059 ms in state COMMIT_PENDING 2025-09-06T01:50:39,354 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:50:54,404 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15050 ms in state COMMIT_PENDING 2025-09-06T01:50:54,405 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:51:09,463 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING 2025-09-06T01:51:09,464 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:51:24,514 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | 
ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15050 ms in state COMMIT_PENDING 2025-09-06T01:51:24,515 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:51:39,574 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING 2025-09-06T01:51:39,574 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:51:54,624 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15050 ms in state COMMIT_PENDING 2025-09-06T01:51:54,626 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:52:09,684 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING 2025-09-06T01:52:09,685 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:52:24,734 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING 2025-09-06T01:52:24,734 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:52:33,312 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Get Bulk Flows And Verify In Cluster" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Get Bulk Flows And Verify In Cluster 2025-09-06T01:52:33,391 | INFO | qtp745298161-451 | StaticConfiguration | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.13 | Binding-over-DOM codec shortcuts are 
enabled 2025-09-06T01:52:33,438 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:33,438 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:33,440 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 1.656 ms 2025-09-06T01:52:33,442 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,443 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,444 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,444 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,445 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,445 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,445 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,446 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,446 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,446 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,447 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,447 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,448 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:33,448 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,448 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:33,449 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,449 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:33,449 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,450 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:33,450 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,451 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:33,451 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,451 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:33,452 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,452 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:33,453 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,453 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:33,453 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,454 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:33,454 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,456 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:33,456 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,457 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:33,457 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,462 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:33,462 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,463 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:33,463 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,464 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:33,464 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,464 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:33,465 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,465 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:33,466 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,466 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:33,466 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,467 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:33,467 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,467 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:33,468 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,469 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:33,469 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,470 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:33,470 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,472 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:33,473 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,473 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,473 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,474 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,474 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,474 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,474 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,478 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,478 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,479 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,479 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,479 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,480 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:33,480 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:33,481 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:33,481 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:33,483 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,483 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,483 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,484 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,484 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,484 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,484 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,484 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,485 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,485 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,485 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,486 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,486 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,488 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,489 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,489 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,489 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,489 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,490 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,490 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,490 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,493 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,493 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,493 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,494 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,494 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,495 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,502 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,503 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,503 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,503 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,505 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,505 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,505 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,506 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,513 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,514 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,515 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,524 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,526 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,527 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,528 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,528 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,529 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,529 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:33,529 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,529 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,530 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:33,530 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,530 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:33,530 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,531 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:33,531 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,531 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:33,531 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,532 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
[The same OneForOneStrategy "Previous action failed" / ShardManager "Supervisor Strategy caught unexpected exception - resuming" WARN pair, carrying an identical CancellationException / UnsupportedOperationException stack trace, repeats a further nine times between 01:52:33,532 and 01:52:33,539.]
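The UnsupportedOperationException in these traces is raised inside MappedByteBuf.getBytes, reached through a sliced Netty buffer while BufThenFileOutputStream spills buffered journal data to a file. The minimal Java sketch below is not the OpenDaylight code; SimpleBuf, MappedBuf, SlicedBuf and spill are hypothetical stand-ins that only illustrate why such an exception surfaces at spill time rather than when the buffer is created.

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.OutputStream;

    public class SpillDemo {

        // Stand-in for a buffer abstraction with an OutputStream-based copy.
        interface SimpleBuf {
            int readableBytes();
            void getBytes(int index, OutputStream out, int length) throws IOException;
        }

        // Stand-in for a memory-mapped buffer that never implemented the
        // OutputStream-based copy (mirrors the MappedByteBuf.getBytes frame).
        static final class MappedBuf implements SimpleBuf {
            private final byte[] data;
            MappedBuf(byte[] data) { this.data = data; }
            @Override public int readableBytes() { return data.length; }
            @Override public void getBytes(int index, OutputStream out, int length) {
                throw new UnsupportedOperationException();
            }
        }

        // Stand-in for a sliced view that owns no data and delegates the copy
        // to the backing buffer (mirrors the AbstractUnpooledSlicedByteBuf frame).
        static final class SlicedBuf implements SimpleBuf {
            private final SimpleBuf delegate;
            private final int offset;
            private final int length;
            SlicedBuf(SimpleBuf delegate, int offset, int length) {
                this.delegate = delegate;
                this.offset = offset;
                this.length = length;
            }
            @Override public int readableBytes() { return length; }
            @Override public void getBytes(int index, OutputStream out, int len) throws IOException {
                delegate.getBytes(offset + index, out, len); // the exception escapes here
            }
        }

        // Stand-in for the switch-to-file step: the unsupported code path is only
        // exercised once buffered bytes have to be streamed out.
        static void spill(SimpleBuf buffered, OutputStream fileOut) throws IOException {
            buffered.getBytes(0, fileOut, buffered.readableBytes());
        }

        public static void main(String[] args) throws IOException {
            SimpleBuf slice = new SlicedBuf(new MappedBuf(new byte[64]), 0, 64);
            try {
                spill(slice, new ByteArrayOutputStream());
            } catch (UnsupportedOperationException e) {
                // Matches the "Caused by" in the log: the slice merely delegates,
                // the backing buffer has no OutputStream-based getBytes.
                System.out.println("spill failed: " + e);
            }
        }
    }

Under that assumption, nothing fails while an entry fits in memory, which would be consistent with these warnings appearing only once a journal write is large enough to trigger the switch to a file.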
2025-09-06T01:52:33,544 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-311-1 sequence 0 (300), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 299
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:33,546 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:33,546 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:33,572 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:33,572 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:33,624 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    [stack trace identical to the 01:52:33,532 entry above]
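The OutOfSequenceEnvelopeException above reports "Expecting envelope 299" while the connection had already advanced to 300, so the backend rejects the stream and the client reconnects and replays. As a rough illustration only (EnvelopeSequencer and its IllegalStateException are hypothetical, not the LeaderFrontendState API), the check amounts to tracking the next expected sequence and failing fast on any gap:

    // Hypothetical sketch of a per-connection sequence check; not the
    // LeaderFrontendState.checkRequestSequence implementation.
    final class EnvelopeSequencer {
        private long expectedSequence;

        EnvelopeSequencer(long initialSequence) {
            this.expectedSequence = initialSequence;
        }

        // Accept the next envelope in order, or reject the stream so the client
        // reconnects and replays from the last acknowledged request.
        void checkSequence(long sequence) {
            if (sequence != expectedSequence) {
                // Stand-in for OutOfSequenceEnvelopeException("Expecting envelope N").
                throw new IllegalStateException(
                    "Expecting envelope " + expectedSequence + ", got " + sequence);
            }
            expectedSequence++;
        }
    }

The subsequent INFO lines show the recovery working as designed: the connection is torn down, the backend for shard 1 is re-resolved with a new sessionId, and a fresh ConnectedClientConnection is established.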
2 more 2025-09-06T01:52:33,625 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,626 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:33,626 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,626 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:33,627 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,627 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:33,628 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,628 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:33,628 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,632 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:33,632 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,632 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:33,633 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,633 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2025-09-06T01:52:33,634 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,634 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    [stack trace identical to the 01:52:33,633 entry above]
2025-09-06T01:52:33,634 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,635 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    [stack trace identical to the 01:52:33,633 entry above]
2025-09-06T01:52:33,635 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,636 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    [stack trace identical to the 01:52:33,633 entry above]
2025-09-06T01:52:33,636 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,637 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    [stack trace identical to the 01:52:33,633 entry above]
2025-09-06T01:52:33,637 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,637 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    [stack trace identical to the 01:52:33,633 entry above]
2025-09-06T01:52:33,637 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,638 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    [stack trace identical to the 01:52:33,633 entry above]
2025-09-06T01:52:33,638 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,639 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    [stack trace identical to the 01:52:33,633 entry above]
2025-09-06T01:52:33,639 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,640 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    [stack trace identical to the 01:52:33,633 entry above]
2025-09-06T01:52:33,640 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,640 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    [stack trace identical to the 01:52:33,633 entry above]
2025-09-06T01:52:33,641 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,641 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    [stack trace identical to the 01:52:33,633 entry above]
2025-09-06T01:52:33,641 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,691 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 118.9 ms
2025-09-06T01:52:33,698 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-336-1 sequence 0 (325), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 324
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:33,699 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:33,699 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:33,701 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:33,701 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:33,703 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    [stack trace identical to the 01:52:33,633 entry above]
2025-09-06T01:52:33,704 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,704 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    [stack trace identical to the 01:52:33,633 entry above]
2025-09-06T01:52:33,704 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:33,906 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 205.5 ms
2025-09-06T01:52:33,907 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-15-1 sequence 0 (4), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 3
    [stack trace identical to the OutOfSequenceEnvelopeException at 01:52:33,698 above]
2025-09-06T01:52:33,908 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:33,908 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:33,910 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:33,910 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:33,914 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,914 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,914 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,914 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,915 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:33,917 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,961 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 50.02 ms 2025-09-06T01:52:33,963 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-25-1 sequence 0 (14), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 13 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:33,964 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:33,964 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:33,966 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:33,966 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:33,972 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:33,973 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:33,973 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:33,974 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,001 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 35.48 ms 2025-09-06T01:52:34,002 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-38-1 sequence 0 (27), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 26 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:34,003 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,003 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:34,005 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:34,005 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,007 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,008 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,008 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,009 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more
2025-09-06T01:52:34,011 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,011 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,011 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,011 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,011 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,012 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,012 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,015 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,016 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,017 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,017 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,018 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,018 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,018 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,019 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,019 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,020 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,020 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,020 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,021 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,021 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,021 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,022 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,022 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,022 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,022 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,022 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,023 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,023 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,023 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,023 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,023 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,023 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,023 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,023 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,023 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,023 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,029 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, 
shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 23.99 ms
2025-09-06T01:52:34,030 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-38-1 sequence 0 (27), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 26
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:34,031 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:34,031 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:34,044 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:34,044 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:34,047 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	...
2 more 2025-09-06T01:52:34,048 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,048 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,048 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,071 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 27.13 ms 2025-09-06T01:52:34,072 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-16-1 sequence 0 (5), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 4 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
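[editor's note] The trace above ends at LeaderFrontendState.checkRequestSequence, which rejects a request envelope that is ahead of the sequence the shard leader expects ("Expecting envelope N"); the client reacts by reconnecting and resolving a fresh backend session, as the surrounding INFO lines show. The following is a minimal, hypothetical Java sketch of that kind of sequence gate; the class and method names (SequenceTracker, check) are illustrative assumptions, not OpenDaylight code.

    // Illustrative sketch only -- NOT the OpenDaylight implementation.
    // Models the behaviour visible in the log: the backend tracks the next
    // envelope sequence it expects and rejects anything out of order, which
    // the client treats as a signal to reconnect and replay its requests.
    final class SequenceTracker {
        private long expected;

        /** Accepts the next envelope or rejects an out-of-sequence one. */
        void check(long received) {
            if (received != expected) {
                // mirrors "Expecting envelope N", after which the client reconnects
                throw new IllegalStateException(
                    "Expecting envelope " + expected + ", got " + received);
            }
            expected++;
        }

        public static void main(String[] args) {
            SequenceTracker tracker = new SequenceTracker();
            tracker.check(0);
            tracker.check(1);
            try {
                tracker.check(3); // envelope 2 was skipped -> out of sequence
            } catch (IllegalStateException e) {
                System.out.println("reconnect: " + e.getMessage());
            }
        }
    }

[editor's note] In the log the same recovery is visible as the sessionId on ShardBackendInfo increasing with each reconnect cycle.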
2025-09-06T01:52:34,072 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,073 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:34,075 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=9, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:34,076 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=9, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,078 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,078 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,078 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,078 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,079 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,079 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,079 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,079 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,079 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,080 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,080 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,080 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,125 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=9, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 49.01 ms 2025-09-06T01:52:34,125 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=9, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-22-1 sequence 0 (11), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 10 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
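[editor's note] The repeated ShardManager warnings around this point follow one pattern: JournalWriteTask.runBatch hits an UnsupportedOperationException from MappedByteBuf.getBytes while appending an entry, aborts the batch, and surfaces the remaining work as a CancellationException("Previous action failed") whose cause is the original failure; the supervisor then logs it and resumes. The sketch below is a hypothetical, self-contained illustration of that wrapping pattern (BatchWriter, Action and onFailure are invented names, not the OpenDaylight classes).

    // Illustrative sketch only -- NOT the OpenDaylight JournalWriteTask.
    // When one append in a batch fails, the remaining actions are cancelled
    // with a CancellationException that carries the original failure as cause.
    import java.util.List;
    import java.util.concurrent.CancellationException;
    import java.util.function.Consumer;

    final class BatchWriter {
        interface Action { void append() throws Exception; }

        static void runBatch(List<Action> batch, Consumer<Throwable> onFailure) {
            Throwable failure = null;
            for (Action action : batch) {
                if (failure != null) {
                    // mirrors "CancellationException: Previous action failed"
                    CancellationException cancelled =
                        new CancellationException("Previous action failed");
                    cancelled.initCause(failure);
                    onFailure.accept(cancelled);
                    continue;
                }
                try {
                    action.append();
                } catch (Exception e) {
                    failure = e; // e.g. UnsupportedOperationException from the buffer
                    onFailure.accept(e);
                }
            }
        }

        public static void main(String[] args) {
            runBatch(List.of(
                    () -> { throw new UnsupportedOperationException(); },
                    () -> System.out.println("never runs")),
                t -> System.out.println("failed: " + t));
        }
    }

[editor's note] This is why every subsequent warning in the log repeats the same cause chain: each queued append after the first failure is reported as a cancellation rather than being retried.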
2025-09-06T01:52:34,126 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=9, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=9, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,126 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:34,129 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=10, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:34,129 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=9, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=10, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,130 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,131 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,132 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,134 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,134 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,135 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,135 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,136 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,136 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,137 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,137 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,137 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,137 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,138 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,138 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,138 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,139 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,139 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,139 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,140 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,140 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,140 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,140 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,141 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,141 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,141 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,142 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,142 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,142 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,143 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,143 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,143 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,144 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,144 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,145 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,145 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,148 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,148 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,148 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 
2025-09-06T01:52:34,149 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 
2025-09-06T01:52:34,157 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=9, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=10, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 28.58 ms 
2025-09-06T01:52:34,158 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=10, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-172-1 sequence 0 (161), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 160 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:34,158 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=10, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=10, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,158 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:34,160 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=11, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:34,160 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=10, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=11, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,161 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,162 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,162 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,162 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,162 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,163 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,163 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,163 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,167 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,168 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,170 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more
2025-09-06T01:52:34,170 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,196 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=10, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=11, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 36.52 ms
2025-09-06T01:52:34,197 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=11, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-19-1 sequence 0 (8), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 7 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:34,198 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=11, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=11, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:34,198 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:34,199 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=12, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:34,200 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=11, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=12, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:34,202 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,203 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,203 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more
2025-09-06T01:52:34,204 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,253 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=11, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=12, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 53.13 ms
2025-09-06T01:52:34,254 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=12, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-17-1 sequence 0 (6), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 5 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:34,256 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=12, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=12, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:34,256 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:34,257 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=13, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:34,257 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=12, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=13, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:34,260 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,261 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,262 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more
2025-09-06T01:52:34,262 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,288 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=12, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=13, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 30.80 ms
2025-09-06T01:52:34,288 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=13, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-20-1 sequence 0 (9), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 8 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:34,289 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=13, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=13, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:34,289 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:34,291 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=14, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:34,291 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=13, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=14, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:34,294 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,294 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,295 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,295 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,296 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,296 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,296 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException
2025-09-06T01:52:34,297 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,297 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
Caused by: java.lang.UnsupportedOperationException
2025-09-06T01:52:34,298 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,302 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
Caused by: java.lang.UnsupportedOperationException
2025-09-06T01:52:34,303 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,303 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
Caused by: java.lang.UnsupportedOperationException
2025-09-06T01:52:34,303 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,303 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
Caused by: java.lang.UnsupportedOperationException
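Every UnsupportedOperationException above bottoms out in the same two frames: a slice of a memory-mapped buffer (MappedByteBuf, wrapped by Netty's AbstractUnpooledSlicedByteBuf) is asked to copy its bytes out while a journal entry is being serialized, and the mapped implementation rejects that particular getBytes variant. The failing call is one of Netty's ByteBuf.getBytes overloads; the minimal sketch below uses stock Netty heap buffers and the OutputStream variant purely to illustrate the call shape. It is not the OpenDaylight MappedByteBuf code path, and the buffer contents are made up.

import io.netty.buffer.ByteBuf;
import io.netty.buffer.Unpooled;
import java.io.ByteArrayOutputStream;
import java.io.IOException;

public final class SliceCopyExample {
    public static void main(String[] args) throws IOException {
        // Illustrative only: a sliced ByteBuf delegates getBytes() to its backing buffer,
        // so an exception thrown by the backing buffer (the memory-mapped buffer in the
        // trace above) surfaces through the slice.
        ByteBuf backing = Unpooled.wrappedBuffer(new byte[] {1, 2, 3, 4, 5, 6, 7, 8});
        ByteBuf slice = backing.slice(2, 4);

        ByteArrayOutputStream out = new ByteArrayOutputStream();
        // Same shape of call as the failing frames: copy the slice's readable bytes to an OutputStream.
        slice.getBytes(slice.readerIndex(), out, slice.readableBytes());
        System.out.println(out.size() + " bytes copied");
    }
}

With a heap-backed buffer this copy succeeds; the log shows the mapped implementation throwing UnsupportedOperationException out of that same delegation chain instead, which is what fails the journal append.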
2025-09-06T01:52:34,304 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,317 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=13, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=14, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 26.06 ms
2025-09-06T01:52:34,318 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=14, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-43-1 sequence 0 (32), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 31
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:34,318 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=14, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=14, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:34,318 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:34,320 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=15, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:34,320 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=14, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=15, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:34,321 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,322 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,322 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,322 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,322 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] 
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,322 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,323 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,323 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,323 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,324 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,324 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,324 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,325 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,325 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,325 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,326 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,326 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,327 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,327 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,327 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,327 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,327 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:34,327 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,327 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,327 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,328 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,328 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,328 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,329 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,329 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,329 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,330 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,330 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,330 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,330 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,331 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,331 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,331 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,331 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,332 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,332 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:34,332 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,346 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=14, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=15, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 25.88 ms
2025-09-06T01:52:34,346 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=15, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-161-1 sequence 0 (150), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 149
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:34,347 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=15, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=15, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:34,347 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:34,349 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=16, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:34,349 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=15, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=16, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:34,350 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,351 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,352 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,352 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,353 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,353 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,353 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,354 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,354 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,355 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,355 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,355 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,358 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,359 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,377 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=15, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=16, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 28.25 ms
2025-09-06T01:52:34,377 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=16, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-29-1 sequence 0 (18), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 17
2025-09-06T01:52:34,378 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=16, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=16, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,378 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:34,380 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=17, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:34,380 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=16, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=17, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,382 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,383 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,383 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,383 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,383 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,384 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,384 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,384 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,384 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,384 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,384 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,385 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,385 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,385 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,417 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=16, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=17, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 36.25 ms 2025-09-06T01:52:34,417 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=17, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-32-1 sequence 0 (21), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 20 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
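Editor's note: the surrounding ClientActorBehavior INFO lines show the client-side recovery cycle: on a sequencing mismatch the frontend marks the connection as reconnecting, refreshes the shard backend (picking up a new sessionId, e.g. 17 -> 18), and then replaces the reconnecting connection with a fresh connected one. The following is a minimal sketch under assumed types and names (Backend, FrontendConnection, onSequencingMismatch); it is not the real cds-access-client implementation.

// Minimal sketch (assumed names) of the reconnect pattern described above.
record Backend(long sessionId) { }

final class FrontendConnection {
    private Backend backend;
    private boolean reconnecting;

    FrontendConnection(final Backend initial) {
        this.backend = initial;
    }

    // Called when the backend reports an out-of-sequence envelope.
    void onSequencingMismatch(final java.util.function.Supplier<Backend> resolveBackend) {
        reconnecting = true;            // "reconnecting as ReconnectingClientConnection"
        backend = resolveBackend.get(); // "refreshing backend for shard" / "resolved shard N"
        reconnecting = false;           // "replaced connection ... with ConnectedClientConnection"
    }

    Backend backend() {
        return backend;
    }
}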
2025-09-06T01:52:34,417 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=17, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=17, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,418 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:34,419 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=18, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:34,420 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=17, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=18, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,423 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,424 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,424 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,425 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,425 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,426 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,426 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,427 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,427 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,427 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,427 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] 
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,428 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,428 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:34,434 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,440 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=17, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=18, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 19.94 ms
2025-09-06T01:52:34,440 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=18, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-30-1 sequence 0 (19), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 18
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:34,440 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=18, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=18, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:34,440 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:34,446 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=19, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:34,446 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=18, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=19, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:34,454 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:34,454 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,454 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
2 more 2025-09-06T01:52:34,464 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,464 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=18, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=19, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 17.61 ms 2025-09-06T01:52:34,464 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,464 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,465 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=19, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-39-1 sequence 0 (28), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 27 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
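The OutOfSequenceEnvelopeException above means the shard leader was still expecting request envelope 27 from this frontend when a later one (28) arrived, so the client connection is torn down and re-established under a fresh sessionId (19, then 20, 21 and 22 in the records that follow) instead of failing the transaction. Below is a minimal sketch of that kind of monotonic sequence guard; EnvelopeSequenceGuard and its method are hypothetical names that only illustrate the shape of the check performed by LeaderFrontendState.checkRequestSequence in the trace, not OpenDaylight's actual implementation.

// Hypothetical illustration of a per-connection envelope sequence guard.
final class EnvelopeSequenceGuard {
    private long expectedEnvelope;

    EnvelopeSequenceGuard(final long firstEnvelope) {
        this.expectedEnvelope = firstEnvelope;
    }

    // Accept only the next expected envelope; anything else forces the caller
    // to reconnect, which is what the frontend does in this log.
    void checkRequestSequence(final long receivedEnvelope) {
        if (receivedEnvelope != expectedEnvelope) {
            throw new IllegalStateException(
                "Expecting envelope " + expectedEnvelope + ", received " + receivedEnvelope);
        }
        expectedEnvelope++;
    }
}

In the ClientActorBehavior records that follow, the reaction to such a mismatch is visible as a reconnect: the connection is marked reconnecting, the backend for shard 1 is refreshed, and the shard is resolved again with the next sessionId.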
2025-09-06T01:52:34,466 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=19, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=19, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:34,467 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:34,479 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=20, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:34,479 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=19, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=20, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:34,484 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,484 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,484 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,485 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,485 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,485 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,505 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=19, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=20, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 26.05 ms 2025-09-06T01:52:34,505 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=20, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-19-1 sequence 0 (8), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 7 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
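The recurring CancellationException entries are secondary failures: the journal write task cancels its pending actions because appending an entry failed while the payload was being serialized, with an UnsupportedOperationException thrown from MappedByteBuf.getBytes as BufThenFileOutputStream tried to switch from its in-memory buffer to a file while CT.writeExternal was streaming data through ObjectOutputStream. The plain-JDK sketch below only shows how an exception from the underlying OutputStream surfaces to the code driving ObjectOutputStream; FailAfterThresholdSink, JournalPayload and JournalSerializationDemo are hypothetical stand-ins, not the OpenDaylight classes named in the trace.

import java.io.Externalizable;
import java.io.IOException;
import java.io.ObjectInput;
import java.io.ObjectOutput;
import java.io.ObjectOutputStream;
import java.io.OutputStream;

// Hypothetical sink: accepts roughly the first 512 bytes (think: the in-memory
// buffer) and refuses anything beyond that, loosely mimicking the failed
// buffer-to-file switch in the trace.
final class FailAfterThresholdSink extends OutputStream {
    private int written;

    @Override
    public void write(final int b) {
        if (++written > 512) {
            throw new UnsupportedOperationException("write past in-memory buffer not supported");
        }
    }
}

// Hypothetical Externalizable payload standing in for the CT class in the trace.
final class JournalPayload implements Externalizable {
    private final byte[] chunk = new byte[2048];

    public JournalPayload() {
        // public no-arg constructor is required by Externalizable
    }

    @Override
    public void writeExternal(final ObjectOutput out) throws IOException {
        out.write(chunk); // streamed through ObjectOutputStream into the sink
    }

    @Override
    public void readExternal(final ObjectInput in) {
        // not needed for this sketch
    }
}

public final class JournalSerializationDemo {
    public static void main(final String[] args) throws IOException {
        try (ObjectOutputStream oos = new ObjectOutputStream(new FailAfterThresholdSink())) {
            // The 2 KiB payload overflows ObjectOutputStream's internal block buffer,
            // so bytes reach the sink during writeObject() and the sink's
            // UnsupportedOperationException propagates out of this call.
            oos.writeObject(new JournalPayload());
        }
    }
}

In the log, that propagated exception is what JournalWriteTask turns into the CancellationException ("Previous action failed") reported by ShardManager and OneForOneStrategy.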
2025-09-06T01:52:34,506 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=20, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=20, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:34,506 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:34,508 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=21, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:34,508 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=20, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=21, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:34,510 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,510 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,510 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,512 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,512 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,513 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,513 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,513 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,546 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=20, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=21, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 37.32 ms 2025-09-06T01:52:34,546 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=21, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-22-1 sequence 0 (11), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 10 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
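Each of these failures is also reported by the actor supervisor: OneForOneStrategy logs "Previous action failed" and ShardManager notes that its strategy "caught unexpected exception - resuming", meaning the failing child actor is resumed with its state intact rather than restarted or stopped. Below is a minimal sketch of a resuming one-for-one supervisor written against the Pekko classic Java API as I understand it; ResumingSupervisor is a hypothetical actor and the decider shown is an assumption for illustration, not the strategy ShardManager actually installs.

import java.time.Duration;

import org.apache.pekko.actor.AbstractActor;
import org.apache.pekko.actor.OneForOneStrategy;
import org.apache.pekko.actor.Props;
import org.apache.pekko.actor.SupervisorStrategy;
import org.apache.pekko.japi.pf.DeciderBuilder;

// Hypothetical parent actor whose children are resumed on unexpected runtime
// exceptions, mirroring the "caught unexpected exception - resuming" lines above.
public class ResumingSupervisor extends AbstractActor {

    private static final SupervisorStrategy STRATEGY = new OneForOneStrategy(
        10,                     // maxNrOfRetries
        Duration.ofMinutes(1),  // window in which retries are counted
        DeciderBuilder
            .match(RuntimeException.class, e -> SupervisorStrategy.resume()) // keep child state, continue
            .matchAny(e -> SupervisorStrategy.escalate())                    // anything else goes up the tree
            .build());

    @Override
    public SupervisorStrategy supervisorStrategy() {
        return STRATEGY;
    }

    @Override
    public Receive createReceive() {
        // Children are created by sending Props to this supervisor.
        return receiveBuilder()
            .match(Props.class, props -> getSender().tell(getContext().actorOf(props), getSelf()))
            .build();
    }
}

Resuming (rather than restarting) explains why the same JournalWriteTask failure keeps reappearing in this log: the shard actors keep running and each subsequent journal append fails the same way.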
2025-09-06T01:52:34,547 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=21, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=21, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:34,547 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:34,549 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=22, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:34,549 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=21, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=22, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:34,550 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,551 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,552 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,552 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,552 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] 
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,553 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,553 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,554 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,554 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,555 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,555 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,555 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,555 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,555 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,555 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,556 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,576 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=21, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=22, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 26.88 ms 2025-09-06T01:52:34,577 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=22, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-58-1 sequence 0 (47), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 46 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:34,578 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=22, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=22, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,579 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:34,581 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=23, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:34,581 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=22, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=23, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,584 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,585 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,585 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,586 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,586 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,587 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,588 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,588 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,589 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,589 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,602 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=22, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=23, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 20.83 ms 2025-09-06T01:52:34,602 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=23, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-27-1 sequence 0 (16), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 15 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:34,603 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=23, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=23, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,603 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:34,604 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=24, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:34,604 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=23, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=24, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,606 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,606 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,606 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,607 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,607 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,607 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,607 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,607 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,607 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,608 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,608 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,608 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,608 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,608 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,608 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,609 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,609 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,609 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,609 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,609 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,610 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,610 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,610 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,610 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,610 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,611 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,611 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,611 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,611 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,612 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,612 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,612 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,612 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,613 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,613 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,613 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,614 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,614 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,614 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,615 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,615 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,615 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,615 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
2025-09-06T01:52:34,627 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=23, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=24, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 23.15 ms
2025-09-06T01:52:34,627 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,628 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,629 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:34,629 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,629 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,630 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,630 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,630 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,631 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,631 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,632 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,632 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,632 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,633 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,634 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=24, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-103-1 sequence 0 (92), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 91 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] 
at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:52:34,635 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=24, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=24, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,635 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:34,636 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,638 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,638 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,639 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,639 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,642 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to 
ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=25, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:34,642 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=24, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=25, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,648 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,648 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,649 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,650 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,680 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=24, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=25, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 37.61 ms 2025-09-06T01:52:34,680 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=25, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-167-1 sequence 0 (156), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 155 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:34,681 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=25, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=25, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,681 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:34,682 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=26, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:34,683 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=25, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=26, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,684 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,685 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,685 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,686 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,686 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,686 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,686 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,686 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,686 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,686 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,686 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,687 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,709 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=25, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=26, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 26.63 ms 2025-09-06T01:52:34,710 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=26, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-64-1 sequence 0 (53), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 52 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:34,711 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=26, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=26, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,711 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:34,712 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=27, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:34,713 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=26, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=27, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,714 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,715 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,715 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,716 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,716 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,716 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,717 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,717 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,717 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,718 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,717 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,718 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,719 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,719 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,720 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,720 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,721 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,721 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,740 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=26, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=27, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 27.16 ms 2025-09-06T01:52:34,740 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=27, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-25-1 sequence 0 (14), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 13 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:34,741 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=27, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=27, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,741 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:34,742 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=28, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:34,743 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=27, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=28, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,752 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more
2025-09-06T01:52:34,752 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,762 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=27, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=28, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 19.37 ms 2025-09-06T01:52:34,794 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=28, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-312-1 sequence 0 (301), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 300 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:34,795 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=28, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=28, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,795 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:34,796 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=29, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:34,797 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=28, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=29, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,798 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,799 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,799 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,800 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,800 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,801 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,817 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=28, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=29, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 20.58 ms 2025-09-06T01:52:34,817 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=29, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-23-1 sequence 0 (12), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 11 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
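Note on the OutOfSequenceEnvelopeException recorded above: the shard leader tracks the next request sequence it expects from each frontend ("Expecting envelope 11") and rejects anything that arrives out of order, which is what pushes the client into the reconnect cycle logged around it. The snippet below is a minimal, hypothetical sketch of that kind of check in plain Java; EnvelopeTracker and its members are invented names for illustration only, not the actual LeaderFrontendState code, and a plain IllegalStateException stands in for OutOfSequenceEnvelopeException.

import java.util.concurrent.atomic.AtomicLong;

// Hypothetical illustration only - not the real LeaderFrontendState implementation.
final class EnvelopeTracker {
    private final AtomicLong expectedSequence = new AtomicLong(0);

    // Accept an envelope only if it carries exactly the next expected sequence number;
    // anything else is rejected, forcing the frontend to reconnect and replay its requests.
    void checkRequestSequence(final long envelopeSequence) {
        final long expected = expectedSequence.get();
        if (envelopeSequence != expected) {
            // Mirrors the "Expecting envelope N" message seen in the log above.
            throw new IllegalStateException(
                "Expecting envelope " + expected + ", got " + envelopeSequence);
        }
        expectedSequence.incrementAndGet();
    }
}

On the client side this is what drives the surrounding records: ClientActorBehavior flips the connection to ReconnectingClientConnection, refreshes the backend for the shard (note the sessionId stepping 28 -> 29 -> 30), replays the queued requests and then reports "replaced connection ... in N ms".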
2025-09-06T01:52:34,820 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=29, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=29, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,820 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:34,823 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=30, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:34,823 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=29, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=30, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,828 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,828 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,829 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,830 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more
2025-09-06T01:52:34,830 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:34,831 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,831 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,831 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,831 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,831 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,832 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,832 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,832 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,833 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,855 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=29, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=30, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 31.37 ms 2025-09-06T01:52:34,855 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=30, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-27-1 sequence 0 (16), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 15 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
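Note on the recurring JournalWriteTask traces in this section: serializing a log entry through BufThenFileOutputStream appears to reach a getBytes overload that MappedByteBuf does not implement (the UnsupportedOperationException at MappedByteBuf.java:363), the write task aborts the current batch, and every action queued behind the failure is completed with CancellationException("Previous action failed") carrying the original error as its cause. The sketch below is a self-contained, hypothetical reconstruction of that cancellation cascade using only JDK types; ToyJournalWriter and its members are invented names, not the actual JournalWriteTask implementation.

import java.io.IOException;
import java.io.OutputStream;
import java.util.ArrayDeque;
import java.util.Queue;
import java.util.concurrent.CancellationException;
import java.util.concurrent.CompletableFuture;

// Hypothetical sketch of the failure pattern visible in the log, not the real JournalWriteTask.
final class ToyJournalWriter {

    // One queued append: the bytes to write and the future to settle.
    record Action(byte[] entry, CompletableFuture<Void> result) {}

    private final Queue<Action> queue = new ArrayDeque<>();
    private final OutputStream target;

    ToyJournalWriter(final OutputStream target) {
        this.target = target;
    }

    CompletableFuture<Void> append(final byte[] entry) {
        final var result = new CompletableFuture<Void>();
        queue.add(new Action(entry, result));
        return result;
    }

    // Writes queued entries in order. The first failure aborts the batch and every
    // remaining action is failed with a CancellationException whose cause is the
    // original error - the "Caused by: UnsupportedOperationException" shape above.
    void runBatch() {
        Throwable failure = null;
        Action action;
        while ((action = queue.poll()) != null) {
            if (failure != null) {
                final var cancelled = new CancellationException("Previous action failed");
                cancelled.initCause(failure);
                action.result().completeExceptionally(cancelled);
                continue;
            }
            try {
                target.write(action.entry());   // may throw, e.g. UnsupportedOperationException
                action.result().complete(null);
            } catch (IOException | RuntimeException e) {
                failure = e;
                action.result().completeExceptionally(e);
            }
        }
    }
}

The surrounding "Supervisor Strategy caught unexpected exception - resuming" and OneForOneStrategy "Previous action failed" lines indicate that the supervisor resumes the affected actor after each such failure, so the actor keeps its state, the next batch hits the same write path, and the identical trace repeats with only the timestamp and dispatcher thread changing.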
2025-09-06T01:52:34,856 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=30, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=30, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,856 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:34,857 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=31, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:34,858 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=30, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=31, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,859 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,859 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,859 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,860 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,860 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,860 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2025-09-06T01:52:34,861 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,861 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,861 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,862 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,862 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,862 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,862 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,863 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,863 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,863 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,862 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,863 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,863 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,863 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,863 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,863 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,863 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,864 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,864 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,864 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,864 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,865 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,865 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,865 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,865 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,865 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,866 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,868 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,868 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,868 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,869 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,869 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,869 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,869 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,869 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,870 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,870 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,870 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,870 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,870 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,871 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,871 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,871 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,871 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,872 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,872 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,872 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ...
2 more 2025-09-06T01:52:34,872 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,873 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,873 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,873 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,874 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,874 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,874 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,874 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,874 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,874 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,874 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,874 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,875 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,890 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=30, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=31, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 32.23 ms 2025-09-06T01:52:34,890 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=31, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-98-1 sequence 0 (87), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 86 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
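Editor's note: the OutOfSequenceEnvelopeException above ("Expecting envelope 86" while the incoming request carried envelope 87, sequence 0) is the shard backend rejecting an envelope that is not the next one it expects, which is what drives the frontend's "reconnecting it" / "refreshing backend" cycle in the surrounding records. The sketch below is a minimal, hypothetical illustration of that kind of monotonic sequence check; the class and method names are invented and this is not the OpenDaylight implementation.

// Hypothetical illustration of a monotonic envelope-sequence check of the kind
// suggested by the OutOfSequenceEnvelopeException in the trace above. All names
// here are invented for the example.
public final class EnvelopeSequenceTracker {
    private long expectedSequence = 0;

    /** Thrown when an envelope arrives out of order; carries the expected value. */
    public static final class OutOfSequenceException extends RuntimeException {
        OutOfSequenceException(final long expected, final long actual) {
            super("Expecting envelope " + expected + ", got " + actual);
        }
    }

    /** Accepts the envelope only if it is the next one in order. */
    public void checkSequence(final long receivedSequence) {
        if (receivedSequence != expectedSequence) {
            // A mismatch does not advance the counter; the caller is expected to
            // tear down the session and resynchronize, which is what the log shows
            // the frontend doing by reconnecting with a new sessionId.
            throw new OutOfSequenceException(expectedSequence, receivedSequence);
        }
        expectedSequence++;
    }
}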
2025-09-06T01:52:34,890 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=31, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=31, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,891 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:34,891 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=32, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:34,891 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=31, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=32, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,892 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,892 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,893 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,893 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,893 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,893 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,893 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,893 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,893 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,894 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,894 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,894 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,894 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,894 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,922 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=31, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=32, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 30.58 ms 2025-09-06T01:52:34,922 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=32, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-24-1 sequence 0 (13), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 12 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:34,922 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=32, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=32, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,922 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:34,924 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=33, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:34,925 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=32, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=33, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,929 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,929 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,929 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,930 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,930 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,930 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,930 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,930 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,930 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,931 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,931 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,933 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,943 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=32, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=33, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 18.51 ms 2025-09-06T01:52:34,943 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=33, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-33-1 sequence 0 (22), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 21 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:34,944 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=33, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=33, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,944 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:34,946 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=34, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:34,947 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=33, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=34, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,953 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,954 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,954 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:34,954 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,970 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=33, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=34, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 23.63 ms 2025-09-06T01:52:34,970 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=34, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-23-1 sequence 0 (12), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 11 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:34,971 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=34, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=34, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,971 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:34,972 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=35, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:34,972 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=34, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=35, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:34,973 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,974 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,974 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,974 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,975 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,975 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:34,975 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:34,976 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
2025-09-06T01:52:34,976 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,976 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,976 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,977 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:34,977 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:34,977 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,037 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=34, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=35, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 64.25 ms
2025-09-06T01:52:35,038 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=35, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-23-1 sequence 0 (12), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 11
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:35,040 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=35, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=35, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:35,040 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:35,042 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=36, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:35,042 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=35, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=36, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:35,045 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,046 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,047 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,047 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,047 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,048 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,048 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,048 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,048 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,048 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,048 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,050 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,051 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,051 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,051 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,051 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,051 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,051 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,052 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,052 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,052 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,052 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,052 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,052 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,052 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,053 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,053 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2 more 2025-09-06T01:52:35,053 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,053 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,053 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,053 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,054 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,054 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,054 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,054 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,054 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,054 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,055 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,055 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,055 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,055 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,055 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,055 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,055 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,062 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,062 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,062 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,062 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,062 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,063 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,063 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,063 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,063 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,063 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,064 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,064 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,064 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,065 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,065 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,065 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,065 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,065 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,065 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2025-09-06T01:52:35,066 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,067 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=35, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=36, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 24.40 ms
2025-09-06T01:52:35,067 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=36, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-252-1 sequence 0 (241), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 240
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
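[Editorial note] The OutOfSequenceEnvelopeException above is the shard leader rejecting a request envelope that arrived ahead of the sequence it expected ("Expecting envelope 240" for a request carrying 241); the frontend then logs "reconnecting it", resolves a new backend session and continues. The following is a minimal, hypothetical Java sketch of that kind of strict sequence check and reconnect-and-replay reaction; the Backend/Client classes and their methods are invented for illustration and are not the OpenDaylight classes named in the trace.

// Hypothetical sketch (not OpenDaylight code): a backend that only accepts envelopes
// in order, and a client that reacts to a mismatch by starting a new session and
// replaying its unacknowledged requests in sequence.
import java.util.ArrayList;
import java.util.List;

public class SequenceSketch {

    /** Backend side: loosely modelled on the sequence check that produced the exception above. */
    static final class Backend {
        private long expected;

        void handle(long sequence, String payload) {
            if (sequence != expected) {
                throw new IllegalStateException(
                    "Expecting envelope " + expected + ", got " + sequence);
            }
            expected++;
            System.out.println("applied " + payload);
        }

        /** A new session starts counting from zero again. */
        void newSession() {
            expected = 0;
        }
    }

    /** Client side: keeps everything not yet acknowledged so it can replay after a reconnect. */
    static final class Client {
        private final List<String> unacknowledged = new ArrayList<>();

        void send(Backend backend, String payload) {
            unacknowledged.add(payload);
            try {
                backend.handle(unacknowledged.size() - 1, payload);
            } catch (IllegalStateException e) {
                // Equivalent of the "reconnecting it" log lines: open a new session and
                // replay every outstanding envelope with fresh sequence numbers.
                backend.newSession();
                for (int i = 0; i < unacknowledged.size(); i++) {
                    backend.handle(i, unacknowledged.get(i));
                }
            }
        }
    }

    public static void main(String[] args) {
        Backend backend = new Backend();
        Client client = new Client();
        client.send(backend, "txn-0");
        client.send(backend, "txn-1");
        backend.newSession();          // backend lost its frontend state, e.g. after a leader change
        client.send(backend, "txn-2"); // sequence 2 arrives where 0 is expected -> reconnect and replay
    }
}

A real frontend would also need request identifiers so that replayed envelopes are not applied twice; the sketch glosses over that and simply re-applies from the start of the new session.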
2025-09-06T01:52:35,069 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=36, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=36, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,069 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:35,074 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=37, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:35,074 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=36, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=37, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,080 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,080 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,080 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,081 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,081 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] 
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,082 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,082 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,082 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,082 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,082 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,082 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,083 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,092 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,094 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,104 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=36, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=37, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 30.10 ms 2025-09-06T01:52:35,104 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=37, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-24-1 sequence 0 (13), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 12 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
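[Editorial note] The recurring "Supervisor Strategy caught unexpected exception - resuming" warnings in this stretch all carry the same CancellationException ("Previous action failed") whose cause is the UnsupportedOperationException thrown while appending to the segmented journal. Judging by the frame names (abortAndFailAction, newCancellationWithCause), once one append in a batch fails, the write task appears to fail the remaining queued actions with a cancellation that keeps the original failure as its cause, which is why the identical stack trace keeps reappearing. Below is a minimal, hypothetical sketch of that propagation pattern in plain JDK terms; it is not the JournalWriteTask implementation.

// Hypothetical sketch of batch-failure propagation: after the first action in a batch
// fails, the remaining actions are completed with a CancellationException whose cause
// is the original failure ("Previous action failed").
import java.util.List;
import java.util.concurrent.CancellationException;
import java.util.concurrent.CompletableFuture;

public class BatchCancellationSketch {

    /** One queued journal action: the work to run plus a future its caller waits on. */
    record Action(String name, Runnable work, CompletableFuture<Void> result) {}

    static void runBatch(List<Action> batch) {
        Throwable firstFailure = null;
        for (Action action : batch) {
            if (firstFailure != null) {
                // Everything after the first failure is cancelled, keeping the original
                // exception attached so supervisors can see why.
                CancellationException cancelled =
                    new CancellationException("Previous action failed");
                cancelled.initCause(firstFailure);
                action.result().completeExceptionally(cancelled);
                continue;
            }
            try {
                action.work().run();
                action.result().complete(null);
            } catch (RuntimeException e) {
                firstFailure = e;
                action.result().completeExceptionally(e);
            }
        }
    }

    public static void main(String[] args) {
        List<Action> batch = List.of(
            new Action("append-1", () -> {}, new CompletableFuture<>()),
            new Action("append-2", () -> { throw new UnsupportedOperationException(); },
                new CompletableFuture<>()),
            new Action("append-3", () -> {}, new CompletableFuture<>()));

        runBatch(batch);
        batch.forEach(a -> System.out.println(a.name() + " -> "
            + (a.result().isCompletedExceptionally() ? "failed/cancelled" : "ok")));
    }
}

Attaching the first failure via initCause is what makes each follow-up warning in such a scheme print the original UnsupportedOperationException chain rather than a bare cancellation.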
2025-09-06T01:52:35,105 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=37, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=37, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,105 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:35,107 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=38, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:35,108 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=37, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=38, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,112 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,112 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,113 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,113 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,150 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=37, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=38, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 42.51 ms 2025-09-06T01:52:35,151 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=38, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-14-1 sequence 0 (3), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 2 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:35,151 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=38, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=38, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,151 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:35,153 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=39, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:35,153 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=38, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=39, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,154 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,155 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,192 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=38, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=39, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 38.26 ms 2025-09-06T01:52:35,192 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=39, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-20-1 sequence 0 (9), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 8 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:35,193 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=39, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=39, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,193 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:35,194 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=40, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:35,194 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=39, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=40, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,200 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,201 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,201 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,201 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,222 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=39, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=40, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 27.20 ms 2025-09-06T01:52:35,222 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=40, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-21-1 sequence 0 (10), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 9 at 
org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] 
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:52:35,222 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=40, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=40, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,222 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:35,229 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=41, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:35,229 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=40, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=41, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,243 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at 
org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,243 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,244 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,244 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,244 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,244 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,244 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,244 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,245 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,268 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,271 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=40, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=41, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 41.85 ms 2025-09-06T01:52:35,272 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=41, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-31-1 sequence 0 (20), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 19 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:35,272 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=41, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=41, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,272 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:35,275 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=42, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:35,275 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=41, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=42, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,288 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,289 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,289 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,290 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,290 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,290 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,294 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,297 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,298 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,299 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,299 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:35,299 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,323 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=41, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=42, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 47.63 ms
2025-09-06T01:52:35,323 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=42, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-20-1 sequence 0 (9), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 8
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:35,324 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=42, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=42, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:35,324 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:35,325 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=43, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:35,325 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=42, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=43, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:35,326 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
2025-09-06T01:52:35,327 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,327 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
Caused by: java.lang.UnsupportedOperationException
2025-09-06T01:52:35,328 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,328 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
Caused by: java.lang.UnsupportedOperationException
2025-09-06T01:52:35,328 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,328 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
Caused by: java.lang.UnsupportedOperationException
2025-09-06T01:52:35,329 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,329 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
Caused by: java.lang.UnsupportedOperationException
2025-09-06T01:52:35,329 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,329 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
Caused by: java.lang.UnsupportedOperationException
2025-09-06T01:52:35,330 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,345 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=42, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=43, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 20.19 ms
2025-09-06T01:52:35,346 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=43, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-31-1 sequence 0 (20), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 19
2025-09-06T01:52:35,346 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=43, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=43, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:35,346 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:35,347 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=44, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:35,347 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=43, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=44, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:35,348 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
Caused by: java.lang.UnsupportedOperationException
2025-09-06T01:52:35,349 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,356 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
Caused by: java.lang.UnsupportedOperationException
2025-09-06T01:52:35,356 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,359 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
Caused by: java.lang.UnsupportedOperationException
2025-09-06T01:52:35,359 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,359 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
Caused by: java.lang.UnsupportedOperationException
2025-09-06T01:52:35,360 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
Caused by: java.lang.UnsupportedOperationException
2025-09-06T01:52:35,360 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,360 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,360 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
Caused by: java.lang.UnsupportedOperationException
2025-09-06T01:52:35,360 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,371 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=43, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=44, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 23.61 ms
2025-09-06T01:52:35,375 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=44, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-511-1 sequence 0 (500), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 499
2025-09-06T01:52:35,375 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=44, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=44, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,375 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:35,376 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=45, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:35,377 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=44, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=45, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,377 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:35,378 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,378 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,379 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,399 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=44, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=45, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 22.54 ms 2025-09-06T01:52:35,399 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=45, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-15-1 sequence 0 (4), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 3 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:35,400 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=45, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=45, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,400 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:35,401 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=46, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:35,401 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=45, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=46, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,403 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,403 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,404 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,404 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,404 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,404 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,404 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,405 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,438 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=45, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=46, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 36.85 ms 2025-09-06T01:52:35,438 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=46, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-35-1 sequence 0 (24), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 23 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:35,439 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=46, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=46, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,439 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:35,440 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=47, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:35,441 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=46, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=47, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,444 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,444 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,444 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,445 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,445 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] 
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,445 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,445 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,446 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,446 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,446 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more
2025-09-06T01:52:35,446 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,446 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,476 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=46, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=47, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 35.52 ms
2025-09-06T01:52:35,477 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=47, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-111-1 sequence 0 (100), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 99
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:35,480 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=47, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=47, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:35,480 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:35,481 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=48, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:35,482 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=47, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=48, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:35,483 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,483 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,484 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,484 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,485 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,486 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,486 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,486 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,486 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,487 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,487 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,487 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,487 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,487 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,487 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,487 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,488 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,488 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,488 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,489 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,489 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,489 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,490 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,490 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,490 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,491 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,491 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,491 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,492 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,492 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,492 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,493 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,493 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,493 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,493 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,494 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,494 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,494 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,494 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,495 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,495 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,495 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,495 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,495 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,496 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,496 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,496 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,497 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,497 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,497 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,497 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,498 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,498 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,498 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,498 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,499 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,499 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,500 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,500 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,500 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,500 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,501 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,501 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,501 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,501 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,502 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,502 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,502 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,502 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,503 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,503 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,503 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,503 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
[The same pair of WARN entries (OneForOneStrategy "Previous action failed" followed by ShardManager "Supervisor Strategy caught unexpected exception - resuming", each carrying the identical CancellationException / UnsupportedOperationException stack trace shown above and elided with "... 2 more") repeats 18 more times between 2025-09-06T01:52:35,504 and 2025-09-06T01:52:35,511, on pekko default-dispatcher threads 5, 13, 31 and 33 and shard-dispatcher threads 41, 42 and 44.]
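The repeated WARN pair above suggests that a single serialization failure (the UnsupportedOperationException thrown from MappedByteBuf.getBytes while a journal entry was being written) is surfaced again for every queued journal action as a CancellationException with the message "Previous action failed". The following is a minimal, hypothetical Java sketch of that general "fail once, cancel the rest" batching pattern; the names BatchedWriter, enqueue and runBatch are invented for illustration and this is not OpenDaylight's actual JournalWriteTask implementation.

    import java.util.ArrayDeque;
    import java.util.Queue;
    import java.util.concurrent.CancellationException;
    import java.util.concurrent.CompletableFuture;

    // Hypothetical sketch: once one write fails, every later queued action is
    // completed exceptionally with a CancellationException("Previous action failed")
    // whose cause is that first failure.
    final class BatchedWriter {

        // One queued write action plus the future its caller is waiting on.
        private record Action(Runnable write, CompletableFuture<Void> done) {}

        private final Queue<Action> queue = new ArrayDeque<>();
        private Throwable firstFailure;

        CompletableFuture<Void> enqueue(Runnable write) {
            CompletableFuture<Void> done = new CompletableFuture<>();
            queue.add(new Action(write, done));
            return done;
        }

        // Drain the queue; after the first failure, remaining actions are
        // cancelled with that failure as the cause instead of being attempted.
        void runBatch() {
            Action action;
            while ((action = queue.poll()) != null) {
                if (firstFailure != null) {
                    action.done().completeExceptionally(previousActionFailed(firstFailure));
                    continue;
                }
                try {
                    action.write().run();
                    action.done().complete(null);
                } catch (RuntimeException e) {              // e.g. UnsupportedOperationException
                    firstFailure = e;                       // remember the root cause
                    action.done().completeExceptionally(e); // the failing action sees the real error
                }
            }
        }

        private static CancellationException previousActionFailed(Throwable cause) {
            CancellationException ce = new CancellationException("Previous action failed");
            ce.initCause(cause);
            return ce;
        }
    }

Under such a pattern one bad entry is enough to make every subsequent append report the same root cause, which would be consistent with the way a single UnsupportedOperationException fans out into many identical WARN lines here.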
2025-09-06T01:52:35,511 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,519 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=47, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=48, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 37.47 ms
2025-09-06T01:52:35,519 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=48, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-276-1 sequence 0 (265), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 264
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
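The OutOfSequenceEnvelopeException above ("Expecting envelope 264" for a request that arrived with a different sequence) suggests the shard leader keeps a per-connection counter of the next request sequence it expects and rejects anything out of order, after which the frontend drops the connection and reconnects on a new session (sessionId 47, then 48, then 49 in the surrounding INFO lines). Below is a minimal, hypothetical sketch of such a per-connection sequence check; SequencedConnection and OutOfSequenceException are invented names for illustration, not the actual LeaderFrontendState logic.

    // Hypothetical per-connection sequence check: accept only the next expected
    // sequence number, reject anything else so the caller can reconnect.
    final class SequencedConnection {

        static final class OutOfSequenceException extends RuntimeException {
            OutOfSequenceException(long expected, long actual) {
                super("Expecting envelope " + expected + ", got " + actual);
            }
        }

        private long expectedSequence;

        void checkRequestSequence(long sequence) {
            if (sequence != expectedSequence) {
                // The caller is expected to tear the connection down and replay
                // its outstanding requests on a fresh session.
                throw new OutOfSequenceException(expectedSequence, sequence);
            }
            expectedSequence++;
        }
    }

On a mismatch the client side would typically discard the connection and replay outstanding requests on a new session, which is what the "reconnecting it", "refreshing backend" and "resolved shard" messages around this trace describe.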
2025-09-06T01:52:35,520 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=48, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=48, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:35,520 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:35,521 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=49, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:35,521 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=48, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=49, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:35,522 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:35,522 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,522 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,523 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,523 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,523 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,523 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,524 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,524 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,524 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,524 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,524 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,524 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,525 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,525 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,525 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,525 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,526 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,526 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,526 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,526 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,526 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,526 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,527 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,527 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,527 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,527 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,527 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,527 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,527 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,528 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,528 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,528 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,528 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,528 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,535 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,535 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,535 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,535 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,536 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,536 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,536 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,536 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,536 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,536 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,536 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,537 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,537 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,537 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,537 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,537 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,537 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,537 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,538 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,538 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,538 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,538 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,538 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,538 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,539 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,539 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,539 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,539 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,539 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,539 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,539 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,540 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,546 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,546 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,547 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,547 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,547 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,547 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,547 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,547 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,548 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,548 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,548 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,548 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,548 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,548 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,548 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,548 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,549 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,549 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,549 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,549 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,549 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,549 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,549 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,550 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,550 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,550 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,550 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,550 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,550 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,550 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,551 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,551 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException
    (stack frames identical to the UnsupportedOperationException above)
    ... 2 more
[The same pair of warnings -- "Previous action failed" from OneForOneStrategy (default-dispatcher-32 and -47) and "Supervisor Strategy caught unexpected exception - resuming" from ShardManager (shard-dispatcher-38, -43 and -41), each carrying the identical CancellationException caused by the identical UnsupportedOperationException -- is logged 14 more times between 01:52:35,551 and 01:52:35,555.]
2025-09-06T01:52:35,556 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,559 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=48, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=49, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 35.61 ms
2025-09-06T01:52:35,560 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=49, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-122-1 sequence 0 (111), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 110
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:35,560 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=49, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=49, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:35,560 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:35,567 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=50, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:35,567 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=49, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=50, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:35,570 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
[Three more "Previous action failed" warnings from OneForOneStrategy (default-dispatcher-13 and -29) and four more "Supervisor Strategy caught unexpected exception - resuming" warnings from ShardManager (shard-dispatcher-41, -44 and -42) follow between 01:52:35,571 and 01:52:35,572, each ShardManager warning carrying the identical CancellationException caused by the identical UnsupportedOperationException.]
2025-09-06T01:52:35,572 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:35,573 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,574 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,574 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,597 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=49, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=50, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 29.80 ms
2025-09-06T01:52:35,597 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=50, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-128-1 sequence 0 (117), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 116
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:35,597 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=50, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=50, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:35,597 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:35,598 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=51, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:35,599 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=50, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=51, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:35,599 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,599 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,600 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,600 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,600 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,601 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,601 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,601 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,602 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,602 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,602 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,603 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,603 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,603 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2025-09-06T01:52:35,603 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,603 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,604 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,604 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,604 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,604 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,604 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,604 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,604 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,605 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,604 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,605 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,605 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,605 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,605 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,605 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,606 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,606 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,606 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,606 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,607 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
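The alternating ShardManager and OneForOneStrategy warnings above are Pekko actor supervision at work: the parent's supervisor strategy decides to resume the failing child rather than restart or stop it, so the child keeps its state and the same journal write failure can recur on the next batch. A minimal sketch of that supervision pattern follows, written against the generic Pekko classic actor API; ResumingSupervisorSketch and its exception match are illustrative assumptions, not the actual ShardManager supervisor code.

import java.time.Duration;
import org.apache.pekko.actor.AbstractActor;
import org.apache.pekko.actor.OneForOneStrategy;
import org.apache.pekko.actor.SupervisorStrategy;
import org.apache.pekko.japi.pf.DeciderBuilder;

// Hypothetical parent actor: resumes a child that throws CancellationException,
// escalates anything else. This mirrors the resume-on-unexpected-exception
// behaviour logged above, not the real ShardManager implementation.
public class ResumingSupervisorSketch extends AbstractActor {

    private static final SupervisorStrategy STRATEGY =
        new OneForOneStrategy(
            10,                    // max retries
            Duration.ofMinutes(1), // within this window
            DeciderBuilder
                .match(java.util.concurrent.CancellationException.class,
                    e -> SupervisorStrategy.resume())
                .matchAny(o -> SupervisorStrategy.escalate())
                .build());

    @Override
    public SupervisorStrategy supervisorStrategy() {
        return STRATEGY;
    }

    @Override
    public Receive createReceive() {
        // Child actors would be created and messaged here in a real supervisor.
        return receiveBuilder().matchAny(msg -> unhandled(msg)).build();
    }
}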
2025-09-06T01:52:35,607 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,607 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,607 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,607 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,608 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,608 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,608 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,635 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=50, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=51, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 36.59 ms
2025-09-06T01:52:35,635 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=51, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-34-1 sequence 0 (23), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 22
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
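The OutOfSequenceEnvelopeException above means the backend shard rejected a request envelope whose sequence number was not the next one it expected ('Expecting envelope 22'), which is what pushes the frontend into the reconnect and replay cycle logged right after it. A minimal sketch of that kind of per-connection ordering check follows; EnvelopeSequencer is a hypothetical name and a generic IllegalStateException stands in for the real exception type.

// Hypothetical per-connection sequence tracker, sketching the kind of ordering
// check implied by the stack trace above: accept only the next expected envelope,
// otherwise fail the request so the frontend reconnects and replays.
public final class EnvelopeSequencer {
    private long expectedSequence;

    public void checkSequence(long receivedSequence) {
        if (receivedSequence != expectedSequence) {
            // The real code throws OutOfSequenceEnvelopeException; a generic
            // unchecked exception stands in for it in this sketch.
            throw new IllegalStateException(
                "Expecting envelope " + expectedSequence + ", got " + receivedSequence);
        }
        expectedSequence++;
    }

    public static void main(String[] args) {
        EnvelopeSequencer sequencer = new EnvelopeSequencer();
        sequencer.checkSequence(0); // accepted, expected moves to 1
        sequencer.checkSequence(1); // accepted, expected moves to 2
        sequencer.checkSequence(0); // out of sequence, throws IllegalStateException
    }
}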
2025-09-06T01:52:35,636 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=51, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=51, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,636 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:35,638 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=52, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:35,638 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=51, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=52, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,639 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,640 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,640 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,640 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,640 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] 
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,640 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,640 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,641 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,641 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,641 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,641 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,641 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,641 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,642 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,642 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,643 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,643 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,643 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,643 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:35,644 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,644 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:35,644 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,644 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:35,645 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,645 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:35,645 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,645 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:35,646 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,646 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:35,646 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,646 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:35,646 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,646 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:35,647 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,647 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:35,647 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,647 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:35,647 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,647 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:35,647 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,647 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:35,648 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,648 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:35,648 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,648 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:35,648 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,648 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:35,648 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,648 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:35,648 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,648 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:35,649 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,672 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=51, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=52, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 33.75 ms
2025-09-06T01:52:35,672 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=52, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-93-1 sequence 0 (82), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 81
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:35,672 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=52, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=52, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:35,672 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:35,674 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=53, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:35,674 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=52, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=53, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:35,675 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:35,676 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,676 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
Caused by: java.lang.UnsupportedOperationException
    ...
2025-09-06T01:52:35,676 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,676 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
Caused by: java.lang.UnsupportedOperationException
    ...
2 more 2025-09-06T01:52:35,677 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,705 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=52, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=53, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 30.78 ms 2025-09-06T01:52:35,705 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=53, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-20-1 sequence 0 (9), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 8 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
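The recurring "Supervisor Strategy caught unexpected exception - resuming" (ShardManager) and OneForOneStrategy "Previous action failed" entries around this point are Pekko's classic supervision at work: when a child actor's processing throws, the parent's OneForOneStrategy picks a directive for that child alone, and here the directive is resume, so the child keeps its state and the failing action is simply not retried. Below is a minimal sketch of a resuming strategy using the classic Java API; ResumingParent and the choice to match CancellationException are illustrative assumptions, not the controller's actual ShardManager configuration:

import java.time.Duration;
import java.util.concurrent.CancellationException;

import org.apache.pekko.actor.AbstractActor;
import org.apache.pekko.actor.OneForOneStrategy;
import org.apache.pekko.actor.SupervisorStrategy;
import org.apache.pekko.japi.pf.DeciderBuilder;

// Hypothetical parent actor showing how a "resume" directive is configured.
public class ResumingParent extends AbstractActor {

    // One decision per failing child: resume keeps the child and its state,
    // which is what the "resuming" WARN lines in this log report.
    private static final SupervisorStrategy STRATEGY =
            new OneForOneStrategy(
                    10,                    // tolerate at most 10 failures...
                    Duration.ofMinutes(1), // ...within one minute before stopping the child
                    DeciderBuilder
                            .match(CancellationException.class, e -> SupervisorStrategy.resume())
                            .matchAny(e -> SupervisorStrategy.escalate())
                            .build());

    @Override
    public SupervisorStrategy supervisorStrategy() {
        return STRATEGY;
    }

    @Override
    public Receive createReceive() {
        return receiveBuilder().matchAny(msg -> unhandled(msg)).build();
    }
}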
2025-09-06T01:52:35,706 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=53, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=53, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,706 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:35,708 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=54, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:35,708 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=53, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=54, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,709 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,709 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,710 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,710 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,710 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,711 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,711 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,711 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,712 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,712 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,713 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,713 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,714 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,714 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,714 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,714 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,714 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,715 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,715 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,714 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,715 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,715 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,715 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,715 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,715 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] 
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,715 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,715 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:35,716 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,716 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:35,716 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,716 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:35,717 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,717 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:35,717 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,717 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:35,718 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,718 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:35,718 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:35,719 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,719 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,764 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=53, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=54, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 55.91 ms
2025-09-06T01:52:35,764 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=54, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-57-1 sequence 0 (46), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 45
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:35,765 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=54, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=54, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:35,765 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:35,767 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=55, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:35,767 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=54, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=55, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:35,768 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:35,769 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:35,769 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,769 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:35,769 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,769 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:35,769 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,769 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,800 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=54, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=55, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 32.67 ms
2025-09-06T01:52:35,800 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=55, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-20-1 sequence 0 (9), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 8
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:35,801 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=55, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=55, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:35,801 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:35,802 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=56, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:35,803 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=55, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=56, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:35,804 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:35,804 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:35,804 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,804 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:35,804 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,805 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,842 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=55, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=56, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 39.42 ms
2025-09-06T01:52:35,842 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=56, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-20-1 sequence 0 (9), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 8
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:35,843 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=56, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=56, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,843 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:35,845 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=57, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:35,845 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=56, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=57, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,848 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,848 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,849 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,849 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,853 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,853 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,853 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,853 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,880 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=56, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=57, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 34.67 ms 2025-09-06T01:52:35,880 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=57, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-110-1 sequence 0 (99), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 98 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:35,880 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=57, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=57, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,880 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:35,882 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=58, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:35,883 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=57, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=58, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,884 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,884 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,884 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,884 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,884 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,885 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,885 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,886 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,886 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,886 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,887 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,888 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,888 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,888 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,889 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,889 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:35,889 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,889 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2025-09-06T01:52:35,889 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,890 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,890 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed [identical stack trace omitted; see the first occurrence above]
2025-09-06T01:52:35,890 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,890 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed [identical stack trace omitted]
2025-09-06T01:52:35,890 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed [identical stack trace omitted]
2025-09-06T01:52:35,890 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,890 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed [identical stack trace omitted]
2025-09-06T01:52:35,891 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed [identical stack trace omitted]
2025-09-06T01:52:35,892 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed [identical stack trace omitted]
2025-09-06T01:52:35,892 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed [identical stack trace omitted]
2025-09-06T01:52:35,893 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed [identical stack trace omitted]
2025-09-06T01:52:35,893 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed [identical stack trace omitted]
2025-09-06T01:52:35,893 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed [identical stack trace omitted]
2025-09-06T01:52:35,894 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed [identical stack trace omitted]
2025-09-06T01:52:35,894 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed [identical stack trace omitted]
2025-09-06T01:52:35,897 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,897 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,897 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,897 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,897 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,897 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,901 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,901 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,901 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,901 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,912 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=57, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=58, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 29.12 ms
2025-09-06T01:52:35,912 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=58, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-137-1 sequence 0 (126), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 125
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:35,912 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=58, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=58, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:35,912 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:35,914 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=59, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:35,915 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=58, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=59, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
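The OutOfSequenceEnvelopeException above is the shard leader rejecting a transaction request whose envelope sequence does not match what it expects for that frontend, which drives the client into the reconnect cycle visible in the surrounding ClientActorBehavior messages (reconnecting the connection, refreshing the backend, resolving shard 1 to a new session). As a rough, hypothetical illustration of that kind of per-connection sequence check, and not the LeaderFrontendState.checkRequestSequence code itself (SequenceTracker and its exception type are invented names), the idea is simply:

// Hypothetical per-connection sequence check; names are invented for illustration.
final class SequenceTracker {
    private long expectedSequence;

    /** Accepts an envelope only if it carries exactly the next expected sequence number. */
    void checkSequence(final long received) {
        if (received != expectedSequence) {
            // The caller is expected to reject the request and let the client
            // re-establish the connection and resume from a known sequence.
            throw new OutOfSequenceException(
                "Expecting envelope " + expectedSequence + ", got " + received);
        }
        expectedSequence++;
    }

    static final class OutOfSequenceException extends RuntimeException {
        private static final long serialVersionUID = 1L;

        OutOfSequenceException(final String message) {
            super(message);
        }
    }
}

On the client side the log shows the corresponding recovery: the ConnectedClientConnection is demoted to a ReconnectingClientConnection, the shard backend is re-resolved (sessionId 58 to 59), and the connection is re-established, presumably so that pending requests can be replayed against the new session.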
2025-09-06T01:52:35,915 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed [identical stack trace omitted; see the first occurrence above]
2025-09-06T01:52:35,916 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,916 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed [identical stack trace omitted]
2025-09-06T01:52:35,916 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,917 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed [identical stack trace omitted]
2025-09-06T01:52:35,917 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,917 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed [identical stack trace omitted]
2025-09-06T01:52:35,917 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,917 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed [identical stack trace omitted]
2025-09-06T01:52:35,918 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,918 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed [identical stack trace omitted]
2025-09-06T01:52:35,918 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,918 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed [identical stack trace omitted]
2025-09-06T01:52:35,918 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,918 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,918 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,918 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more
2025-09-06T01:52:35,918 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,943 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=58, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=59, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 28.83 ms
2025-09-06T01:52:35,944 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=59, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-25-1 sequence 0 (14), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 13 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
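The OutOfSequenceEnvelopeException above ("Expecting envelope 13") appears to be the shard leader refusing a request envelope whose number is ahead of the one it expects for this frontend; the records that follow show the client reacting by flipping the connection to ReconnectingClientConnection, refreshing the backend for shard 1 and re-establishing it under a new sessionId. As a rough illustration of that kind of guard, the hypothetical Java sketch below rejects any envelope whose number does not match a running counter. All names in it are invented; it is not the actual LeaderFrontendState code.

// Hypothetical sketch of a per-frontend envelope-sequence guard. Invented names;
// not the actual OpenDaylight LeaderFrontendState implementation.
final class EnvelopeSequenceGuard {

    /** Thrown when an envelope arrives out of order, mirroring the log message above. */
    static final class OutOfSequenceEnvelopeException extends RuntimeException {
        OutOfSequenceEnvelopeException(final long expected) {
            super("Expecting envelope " + expected);
        }
    }

    private long expectedSequence;

    /** Accept the next envelope only if it carries exactly the expected sequence number. */
    void checkRequestSequence(final long receivedSequence) {
        if (receivedSequence != expectedSequence) {
            // Reject and leave expectedSequence untouched; the sender has to reconnect,
            // as the ClientActorBehavior records below show.
            throw new OutOfSequenceEnvelopeException(expectedSequence);
        }
        expectedSequence = receivedSequence + 1;
    }

    public static void main(final String[] args) {
        final EnvelopeSequenceGuard guard = new EnvelopeSequenceGuard();
        guard.checkRequestSequence(0);  // accepted, now expecting 1
        guard.checkRequestSequence(1);  // accepted, now expecting 2
        guard.checkRequestSequence(14); // throws "Expecting envelope 2"
    }
}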
2025-09-06T01:52:35,944 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=59, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=59, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,944 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:35,955 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=60, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:35,956 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=59, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=60, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,957 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more
2025-09-06T01:52:35,957 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,984 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=59, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=60, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 28.39 ms
2025-09-06T01:52:35,987 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=60, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-14-1 sequence 0 (3), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 2 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
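Every repeated "Caused by: java.lang.UnsupportedOperationException" chain in this excerpt has the same shape: EntryJournalV1 serializes a log entry through java.io.ObjectOutputStream, the payload's writeExternal copies its chunked bytes into a BufThenFileOutputStream, and the copy bottoms out in MappedByteBuf.getBytes, which throws. The JDK-only sketch below reproduces just that call shape with invented stand-in classes; it is not the OpenDaylight BufThenFileOutputStream or MappedByteBuf code, only an illustration of how an unsupported bulk write surfaces from inside writeObject().

// Hypothetical, JDK-only reproduction of the call shape in the repeated traces:
// ObjectOutputStream.writeObject -> writeExternal -> OutputStream.write
// -> UnsupportedOperationException. Stand-in classes only, not OpenDaylight code.
import java.io.Externalizable;
import java.io.IOException;
import java.io.ObjectInput;
import java.io.ObjectOutput;
import java.io.ObjectOutputStream;
import java.io.OutputStream;

public class SerializationFailureSketch {

    /** Stands in for a sink that accepts a small amount of data, then hits an unsupported bulk copy. */
    static final class SmallBufferSink extends OutputStream {
        private int written;

        @Override
        public void write(final int b) {
            write(new byte[] { (byte) b }, 0, 1);
        }

        @Override
        public void write(final byte[] b, final int off, final int len) {
            written += len;
            if (written > 512) {
                // analogous to switching from the in-memory buffer to the mapped
                // buffer and hitting a getBytes() variant that is not supported
                throw new UnsupportedOperationException();
            }
            // otherwise: pretend the bytes were buffered successfully
        }
    }

    /** Stands in for a persisted payload that writes raw chunks from writeExternal(). */
    public static final class Payload implements Externalizable {
        @Override
        public void writeExternal(final ObjectOutput out) throws IOException {
            out.write(new byte[4096]); // analogous to ChunkedByteArray.copyTo(...)
        }

        @Override
        public void readExternal(final ObjectInput in) {
            // not needed for this write-side sketch
        }
    }

    public static void main(final String[] args) throws IOException {
        final ObjectOutputStream oos = new ObjectOutputStream(new SmallBufferSink());
        // The stream's internal block buffer drains while writeExternal() is running,
        // so the UnsupportedOperationException surfaces from inside writeObject().
        oos.writeObject(new Payload());
    }
}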
2025-09-06T01:52:35,987 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=60, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=60, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,988 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:35,989 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=61, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:35,989 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=60, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=61, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:35,990 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,990 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,990 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,990 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,990 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,991 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,991 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,991 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,991 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,991 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,991 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,991 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,992 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,992 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,992 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:35,992 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:35,992 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
2025-09-06T01:52:35,992 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,992 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,993 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,993 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,993 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,993 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,993 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,993 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,994 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,994 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,994 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,994 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,994 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,994 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,994 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,995 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,995 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,995 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,995 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,995 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,995 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,996 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,996 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,996 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,996 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:35,996 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,997 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,997 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,997 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,998 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,998 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,999 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,999 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:35,999 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,000 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,000 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
2025-09-06T01:52:36,000 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,000 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,000 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,000 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,014 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=60, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=61, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 24.45 ms 2025-09-06T01:52:36,014 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=61, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-65-1 sequence 0 (54), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 53 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:36,015 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=61, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=61, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:36,015 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:36,017 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=62, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:36,017 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=61, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=62, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:36,018 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:36,019 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:36,019 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,019 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,019 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:36,019 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,019 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:36,020 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,047 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=61, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=62, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 29.48 ms
2025-09-06T01:52:36,047 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=62, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-21-1 sequence 0 (10), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 9
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:36,047 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=62, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=62, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:36,047 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:36,049 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=63, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:36,049 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=62, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=63, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:36,054 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:36,054 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,054 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:36,054 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,082 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=62, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=63, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 33.12 ms
2025-09-06T01:52:36,083 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=63, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-16-1 sequence 0 (5), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 4
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:36,083 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=63, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=63, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:36,083 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:36,084 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=64, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:36,084 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=63, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=64, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:36,085 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:36,085 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,085 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:36,085 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,085 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:36,086 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,086 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
	... 2 more
2025-09-06T01:52:36,086 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,086 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,087 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,087 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,087 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,087 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,088 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,088 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,088 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,088 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,088 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,088 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,088 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,120 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=63, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=64, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 35.28 ms 2025-09-06T01:52:36,120 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=64, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-30-1 sequence 0 (19), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 18 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:36,120 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=64, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=64, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,120 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:36,122 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=65, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:36,123 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=64, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=65, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,124 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,124 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,124 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,125 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,125 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,126 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,126 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,127 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,127 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,128 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,128 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,128 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,128 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,128 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,128 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,128 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,128 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,129 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,163 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=64, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=65, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 40.30 ms 2025-09-06T01:52:36,163 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=65, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-26-1 sequence 0 (15), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: 
Expecting envelope 14
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:36,164 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=65, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=65, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:36,164 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:36,165 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=66, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:36,166 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=65, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=66, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:36,166 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:36,167 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,167 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:36,167 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,195 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=65, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=66, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 29.51 ms
2025-09-06T01:52:36,195 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=66, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-15-1 sequence 0 (4), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 3
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:36,196 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=66, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=66, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,196 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:36,198 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=67, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:36,198 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=66, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=67, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,199 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,199 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,199 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,201 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,232 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=66, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=67, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 33.91 ms 2025-09-06T01:52:36,232 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=67, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-15-1 sequence 0 (4), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 3 at 
org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] 
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:52:36,233 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=67, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=67, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,233 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:36,234 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=68, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:36,235 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=67, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=68, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,235 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] 
at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,235 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,235 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,236 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,236 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,236 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,236 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,236 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,237 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,237 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:36,237 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,237 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:36,237 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,237 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,270 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=67, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=68, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 35.02 ms
2025-09-06T01:52:36,270 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=68, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-26-1 sequence 0 (15), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 14
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:36,270 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=68, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=68, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:36,270 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:36,273 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=69, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:36,273 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=68, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=69, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:36,274 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
[The CancellationException above is again followed by the identical "Caused by: java.lang.UnsupportedOperationException" stack trace (MappedByteBuf.getBytes through JournalWriteTask.runBatch, "... 2 more"). Between 01:52:36,275 and 01:52:36,297 the same pair of entries then repeats continuously: "WARN ... OneForOneStrategy ... Previous action failed" and "WARN ... ShardManager ... Supervisor Strategy caught unexpected exception - resuming", each carrying the same CancellationException/UnsupportedOperationException trace; the final trace in this run is cut off in the capture.]
2 more 2025-09-06T01:52:36,297 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,297 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=68, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=69, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 23.47 ms 2025-09-06T01:52:36,297 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,297 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,297 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,297 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,297 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,298 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,298 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,298 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,298 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,298 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,298 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,298 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,298 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,299 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,299 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,299 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,299 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,300 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,300 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,300 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,300 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,300 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,300 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,301 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,301 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,301 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,301 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,301 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,301 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,301 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,301 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,302 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,302 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,302 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,302 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,302 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,302 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,303 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,303 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,303 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,303 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,303 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,303 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,303 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,303 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,304 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,304 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,304 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,304 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,304 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,304 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,305 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,305 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,305 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,305 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,305 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,305 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,305 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,305 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,305 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,306 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,306 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,306 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,306 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,306 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,306 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,306 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,306 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,307 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2025-09-06T01:52:36,307 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,307 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:36,307 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,307 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:36,307 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,307 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:36,308 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,308 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:36,308 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,308 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:36,308 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,308 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:36,308 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,308 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:36,309 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,309 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:36,309 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,310 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:36,310 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,310 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:36,310 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,310 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:36,311 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:36,311 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:36,311 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:36,311 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,312 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,312 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,312 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,312 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:36,312 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,312 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:36,312 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,313 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:36,313 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,313 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:36,313 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,313 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:36,313 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,313 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:36,314 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=69, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-154-1 sequence 0 (143), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 142
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:36,314 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=69, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=69, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:36,314 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:36,315 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,326 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=70, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:36,326 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=69, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=70, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:36,347 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
	at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
	at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
	at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
	at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
	at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
	at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
	at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
	at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,348 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,352 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,352 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,353 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,354 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,354 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,354 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,354 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,355 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,355 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,355 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,371 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=69, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=70, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 44.59 ms 2025-09-06T01:52:36,371 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=70, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-289-1 sequence 0 (278), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 277 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:36,372 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=70, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=70, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:36,372 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:36,373 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=71, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:36,373 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=70, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=71, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:36,374 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:36,374 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,374 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:36,375 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,375 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:36,375 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,375 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:36,376 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,396 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=70, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=71, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 23.42 ms
2025-09-06T01:52:36,396 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=71, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-40-1 sequence 0 (29), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 28
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:36,397 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=71, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=71, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:36,397 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:36,399 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=72, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:36,399 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=71, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=72, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:36,400 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:36,400 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:36,401 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,401 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:36,401 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,401 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,401 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:36,402 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ...
2 more 2025-09-06T01:52:36,402 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,402 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,402 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,402 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,426 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=71, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=72, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 27.62 ms 2025-09-06T01:52:36,427 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=72, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-52-1 sequence 0 (41), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 40 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:36,427 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=72, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=72, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,427 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:36,429 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=73, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:36,429 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=72, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=73, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,432 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,432 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,432 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,432 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,433 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,433 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,433 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] 
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,433 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,433 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,434 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,434 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,434 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,434 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,434 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,457 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=72, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=73, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 28.19 ms 2025-09-06T01:52:36,457 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=73, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-56-1 sequence 0 (45), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 44 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:36,458 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=73, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=73, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,458 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:36,459 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=74, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:36,459 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=73, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=74, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,460 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,461 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,461 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,461 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,461 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,462 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,462 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,463 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,486 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=73, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=74, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 26.92 ms 2025-09-06T01:52:36,486 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=74, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-29-1 sequence 0 (18), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 17 at 
org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] 
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:52:36,487 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=74, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=74, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,487 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:36,489 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=75, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:36,489 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=74, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=75, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,490 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] 
at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,490 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,530 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=74, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=75, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 41.46 ms 2025-09-06T01:52:36,531 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=75, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-13-1 sequence 0 (2), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 1 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
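The record that closes just above is the backend telling this frontend it expected a different envelope sequence (OutOfSequenceEnvelopeException: Expecting envelope 1), after which the client tears the connection down, resolves the shard again under a fresh sessionId and reconnects, which is the replaced/reconnecting/resolving churn in the surrounding INFO lines. As a rough illustration only (the names below are mine, not the actual LeaderFrontendState code), the guard amounts to a per-session counter that rejects any request whose sequence is not the next one expected:

```java
/**
 * Hypothetical sketch of a per-session request-sequence guard, in the spirit of
 * the "Expecting envelope N" checks above. Names are illustrative only.
 */
final class SequenceGuard {
    /** Signals an out-of-order request; the caller is expected to reconnect. */
    static final class OutOfSequenceException extends RuntimeException {
        OutOfSequenceException(final long expected, final long actual) {
            super("Expecting envelope " + expected + ", got " + actual);
        }
    }

    private long expected;          // next sequence this session will accept

    /** Accepts the request only if it carries exactly the expected sequence. */
    void checkRequestSequence(final long actual) {
        if (actual != expected) {
            throw new OutOfSequenceException(expected, actual);
        }
        expected++;                 // in order: advance the window by one
    }
}
```

A rejected request is not the end of the story; the exception forces the client back through backend resolution and reconnection, presumably so both sides resume from a sequence they agree on, which matches the new sessionId handed out on each reconnect in the INFO lines that follow.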
2025-09-06T01:52:36,531 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=75, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=75, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,531 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:36,533 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=76, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:36,534 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=75, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=76, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,534 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,535 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,535 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,535 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,535 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,536 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,563 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=75, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=76, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 29.30 ms 2025-09-06T01:52:36,563 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=76, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-17-1 sequence 0 (6), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 5 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
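Behind every ShardManager WARN in this stretch sits the same root cause: ChunkedByteArray.copyTo is serializing a log entry into BufThenFileOutputStream, the stream decides to switch from its in-memory buffer to a file, and the getBytes call used to copy the buffered bytes lands on a MappedByteBuf method that throws UnsupportedOperationException. The sketch below is a much-simplified, hypothetical rendering of that buffer-then-spill pattern (class and method names are illustrative, not the actual org.opendaylight.raft.spi code); the memory.writeTo(file) line marks the copy step where such a failure would surface:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;

/**
 * Hypothetical "buffer in memory, spill to a file past a threshold" stream,
 * loosely modelled on the BufThenFileOutputStream frames in the traces above.
 * Names and structure are illustrative only.
 */
public final class SpillToFileOutputStream extends OutputStream {
    private final int threshold;                 // bytes to keep in memory before spilling
    private final Path spillFile;
    private final ByteArrayOutputStream memory = new ByteArrayOutputStream();
    private OutputStream file;                   // non-null once we have spilled

    public SpillToFileOutputStream(final int threshold, final Path spillFile) {
        this.threshold = threshold;
        this.spillFile = spillFile;
    }

    @Override
    public void write(final int b) throws IOException {
        write(new byte[] { (byte) b }, 0, 1);
    }

    @Override
    public void write(final byte[] buf, final int off, final int len) throws IOException {
        if (file == null && memory.size() + len > threshold) {
            switchToFile();
        }
        (file == null ? memory : file).write(buf, off, len);
    }

    private void switchToFile() throws IOException {
        file = Files.newOutputStream(spillFile);
        // The copy of the already-buffered bytes happens here. In the traces this is
        // the getBytes(...) call on a sliced buffer backed by MappedByteBuf; a backing
        // buffer that does not support that copy throws UnsupportedOperationException
        // and the whole serialization attempt fails.
        memory.writeTo(file);
    }

    @Override
    public void close() throws IOException {
        if (file != null) {
            file.close();
        }
        // If we never spilled, the serialized bytes simply stay in 'memory';
        // a retrieval method is omitted to keep the sketch short.
    }
}
```

Once that append fails, JournalWriteTask stops processing the batch and fails every queued action with CancellationException("Previous action failed") chained to the same root cause (the newCancellationWithCause and abortAndFailAction frames in the traces), which is why the identical UnsupportedOperationException stack repeats under each subsequent WARN rather than indicating many independent faults.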
2025-09-06T01:52:36,563 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=76, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=76, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,564 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:36,565 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=77, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:36,565 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=76, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=77, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,566 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,567 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,567 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,567 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,567 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] 
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,568 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,568 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,568 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,568 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,568 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,569 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,569 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,569 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,569 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,569 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,570 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,570 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,570 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,570 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,570 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,570 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,571 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,593 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=76, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=77, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 27.57 ms 2025-09-06T01:52:36,593 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=77, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-39-1 sequence 0 (28), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 27 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:36,593 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=77, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=77, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,593 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:36,595 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=78, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:36,595 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=77, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=78, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,603 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,604 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,604 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,604 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,604 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,605 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,605 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,606 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,606 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,605 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,607 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,607 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,607 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,607 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,607 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,607 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,607 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,608 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,608 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,608 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,608 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,608 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,608 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,608 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more
2025-09-06T01:52:36,608 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,609 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,638 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=77, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=78, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 42.48 ms
2025-09-06T01:52:36,638 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=78, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-55-1 sequence 0 (44), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 43
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:36,638 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=78, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=78, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:36,638 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:36,639 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=79, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:36,640 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=78, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=79, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:36,641 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,641 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,641 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,642 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,642 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,642 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,642 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,643 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,643 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,643 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,643 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,644 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,644 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,645 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,645 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,645 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,645 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:36,645 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,645 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,646 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,660 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=78, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=79, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 20.81 ms
2025-09-06T01:52:36,660 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=79, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-55-1 sequence 0 (44), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 43
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:36,661 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=79, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=79, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:36,661 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:36,662 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=80, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:36,663 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=79, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=80, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:36,664 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,665 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,665 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,666 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more
2025-09-06T01:52:36,669 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,669 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,669 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,669 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,697 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=79, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=80, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 34.47 ms
2025-09-06T01:52:36,697 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=80, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-33-1 sequence 0 (22), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 21
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:36,698 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=80, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=80, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:36,698 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:36,700 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=81, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:36,700 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=80, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=81, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:36,701 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
	at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
	at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,702 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,738 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=80, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=81, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 37.79 ms 2025-09-06T01:52:36,738 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=81, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-13-1 sequence 0 (2), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 1 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:36,739 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=81, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=81, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,739 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:36,741 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=82, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:36,741 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=81, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=82, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,742 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,742 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,742 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,743 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,743 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,743 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,743 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,744 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,744 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,744 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,744 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,745 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,745 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,745 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,776 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=81, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=82, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 34.94 ms 2025-09-06T01:52:36,776 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=82, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-39-1 sequence 0 (28), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 27 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:36,776 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=82, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=82, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,777 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:36,778 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=83, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:36,778 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=82, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=83, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,779 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,779 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,779 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,780 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,780 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,780 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,781 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,781 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,781 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,782 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,782 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,783 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,783 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,783 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,784 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,784 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,784 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,784 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,784 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more
2025-09-06T01:52:36,785 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,817 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=82, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=83, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 39.47 ms
2025-09-06T01:52:36,818 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=83, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-36-1 sequence 0 (25), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 24 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:36,818 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=83, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=83, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,818 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:36,820 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=84, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:36,820 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=83, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=84, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,820 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,821 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,821 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,821 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,822 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more
2025-09-06T01:52:36,822 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,868 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=83, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=84, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 47.17 ms
2025-09-06T01:52:36,869 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=84, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-15-1 sequence 0 (4), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 3 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:36,870 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=84, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=84, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,870 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:36,872 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=85, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:36,872 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=84, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=85, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,876 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,877 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more
2025-09-06T01:52:36,877 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,878 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:36,894 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=84, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=85, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 22.19 ms
2025-09-06T01:52:36,894 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=85, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-176-1 sequence 0 (165), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 164 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:52:36,894 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=85, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=85, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,894 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:36,896 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=86, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:36,896 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=85, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=86, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,897 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] 
at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,897 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,897 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,898 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,898 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,898 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,898 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,899 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,919 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=85, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=86, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 23.32 ms 2025-09-06T01:52:36,920 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=86, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-45-1 sequence 0 (34), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 33 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:36,920 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=86, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=86, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,920 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:36,921 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=87, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:36,921 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=86, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=87, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,922 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,922 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,922 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,923 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,923 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,923 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,943 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=86, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=87, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 21.94 ms 2025-09-06T01:52:36,943 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=87, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-20-1 sequence 0 (9), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 8 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:36,944 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=87, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=87, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,944 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:36,945 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=88, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:36,946 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=87, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=88, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:36,947 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,947 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,947 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,948 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,948 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,948 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,949 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,951 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,950 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,951 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,952 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,952 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,963 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,980 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,981 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,981 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,981 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:36,982 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,982 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=87, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=88, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 35.89 ms 2025-09-06T01:52:36,982 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,983 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,983 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,983 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,984 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,984 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,984 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,985 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,985 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,985 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,985 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,986 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,986 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,986 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,986 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,987 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,987 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,987 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,987 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,988 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,988 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,988 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,988 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,989 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,989 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,989 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,989 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,990 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,990 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,990 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,990 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,991 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,991 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,991 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,991 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,992 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:36,992 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:36,992 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,061 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=88, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-566-1 sequence 0 (555), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 554 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:37,061 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=88, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=88, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:37,061 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:37,062 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=89, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:37,062 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=88, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=89, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:37,063 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,064 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,064 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,064 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,064 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] 
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,064 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,101 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=88, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=89, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 38.70 ms 2025-09-06T01:52:37,101 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=89, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-19-1 sequence 0 (8), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 7 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:52:37,101 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=89, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=89, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:37,102 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:37,103 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=90, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:37,103 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=89, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=90, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:37,108 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,108 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,108 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,108 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,108 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,109 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,109 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,109 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,109 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,109 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,109 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,109 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,109 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,110 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,110 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,110 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,110 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,110 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,110 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,110 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,110 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:37,111 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,147 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=89, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=90, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 44.45 ms 2025-09-06T01:52:37,148 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=90, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-56-1 sequence 0 (45), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 44 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
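The OutOfSequenceEnvelopeException ending above ("Expecting envelope 44" while envelope 45 arrived) is the shard leader refusing a request whose envelope sequence is ahead of what it expects; the surrounding ClientActorBehavior entries show the frontend reacting by dropping the connection and re-resolving the backend with a fresh sessionId (89 -> 90 -> 91). Below is a minimal sketch of that kind of strict per-connection sequence check. It is illustrative only: the class is hypothetical and the method name merely echoes the checkRequestSequence() frames in the trace; this is not the OpenDaylight LeaderFrontendState implementation.

```java
import java.util.concurrent.atomic.AtomicLong;

// Illustrative only: a strict per-connection envelope sequence check in the spirit of the
// checkRequestSequence() frames above. Names are invented; this is not OpenDaylight code.
final class EnvelopeSequenceCheck {

    /** Signals that an envelope arrived ahead of the expected sequence number. */
    static final class OutOfSequenceException extends RuntimeException {
        OutOfSequenceException(final long expected, final long received) {
            super("Expecting envelope " + expected + ", received " + received);
        }
    }

    private final AtomicLong nextExpected = new AtomicLong();

    /** Accepts envelopes only in strict order; any gap is reported back to the caller. */
    void checkRequestSequence(final long received) {
        final long expected = nextExpected.get();
        if (received != expected) {
            throw new OutOfSequenceException(expected, received);
        }
        nextExpected.incrementAndGet();
    }

    public static void main(final String[] args) {
        final EnvelopeSequenceCheck check = new EnvelopeSequenceCheck();
        check.checkRequestSequence(0);
        check.checkRequestSequence(1);
        try {
            // Envelope 2 never arrived, so 3 shows up first -- the same shape as the
            // "Expecting envelope 44" rejection in the log entry above.
            check.checkRequestSequence(3);
        } catch (final OutOfSequenceException e) {
            // A frontend reacting to this tears the connection down and re-resolves the
            // backend shard, which is what the ClientActorBehavior entries here describe.
            System.out.println(e.getMessage() + " -> reconnecting");
        }
    }
}
```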
2025-09-06T01:52:37,148 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=90, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=90, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:37,148 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:37,149 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=91, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:37,149 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=90, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=91, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:37,150 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,151 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,151 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,151 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,152 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,152 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,185 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=90, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=91, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 35.93 ms 2025-09-06T01:52:37,185 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=91, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-49-1 sequence 0 (38), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 37 at 
org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] 
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:52:37,186 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=91, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=91, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:37,186 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1 2025-09-06T01:52:37,186 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=92, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-09-06T01:52:37,186 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=91, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=92, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T01:52:37,187 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] 
at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,187 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,188 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,188 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,188 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,189 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,189 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,190 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,191 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,192 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,192 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:37,193 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:37,193 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
2025-09-06T01:52:37,194 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:37,194 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:37,194 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:37,194 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:37,195 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:37,195 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:37,195 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:37,195 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:37,195 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:37,196 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:37,196 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:37,196 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:37,196 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:37,196 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:37,197 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:37,197 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:37,197 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:37,197 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:37,197 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:37,197 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:37,198 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:37,198 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:37,198 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:37,258 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=91, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=92, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 71.32 ms
2025-09-06T01:52:37,259 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=92, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-308-1 sequence 0 (297), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 296
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:552) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:382) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:329) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:344) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:291) ~[bundleFile:11.0.0]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:52:37,260 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=92, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=92, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:37,260 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:37,261 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=93, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:37,261 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=92, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=93, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:37,272 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:37,273 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:37,273 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:37,273 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:37,309 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=92, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=93, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 47.16 ms
2025-09-06T01:52:37,309 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=93, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-3-datastore-config-fe-1-txn-209-1 sequence 0 (198), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 197
2025-09-06T01:52:37,310 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=93, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=93, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:37,310 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:52:37,312 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-09-06T01:52:37,312 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=93, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:52:37,314 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:37,314 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:37,315 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:37,315 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:37,315 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
2025-09-06T01:52:37,315 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:37,316 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,316 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,316 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,316 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,316 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,317 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,317 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,317 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,317 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,317 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,318 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,318 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,319 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,319 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,319 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,320 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,320 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,320 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,320 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] 
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,321 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,321 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,321 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:37,322 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,322 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,322 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:37,322 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,323 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:37,323 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:37,324 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,324 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,324 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:37,324 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,325 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:37,325 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:37,326 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,326 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,326 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:37,326 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,327 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:37,329 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,329 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:37,333 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,333 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:37,334 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,334 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 
2 more 2025-09-06T01:52:37,334 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,339 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=93, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 26.61 ms 2025-09-06T01:52:37,340 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,340 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,340 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,340 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,343 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,344 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,344 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,344 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,345 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,345 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,345 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,345 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,346 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0] ... 2 more 2025-09-06T01:52:37,346 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed 2025-09-06T01:52:37,346 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:459) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:440) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:402) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.0] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.UnsupportedOperationException at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:363) ~[bundleFile:11.0.0] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.2.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.0] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.0] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.0] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] 
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.0]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.0]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:385) ~[bundleFile:11.0.0]
    ... 2 more
2025-09-06T01:52:37,347 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Previous action failed
2025-09-06T01:52:39,794 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-09-06T01:52:39,794 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-09-06T01:52:48,824 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | Leader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config (Leader): At least 1 followers need to be active, Switching member-3-shard-inventory-config from Leader to IsolatedLeader
2025-09-06T01:52:48,826 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-inventory-config (Leader) :- Switching from behavior Leader to IsolatedLeader, election term: 6
2025-09-06T01:52:48,826 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.0 | RoleChangeNotifier for member-3-shard-inventory-config , received role change from Leader to IsolatedLeader
2025-09-06T01:52:48,826 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received role changed for member-3-shard-inventory-config from Leader to IsolatedLeader
2025-09-06T01:52:54,854 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING
2025-09-06T01:52:54,854 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-09-06T01:53:09,913 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-09-06T01:53:09,913 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-09-06T01:53:24,973 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING
2025-09-06T01:53:24,974 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-09-06T01:53:37,344 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-09-06T01:53:37,345 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: refreshing backend for shard 1
2025-09-06T01:53:37,506 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 179 to 5 seconds
java.lang.Throwable: null
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?]
    at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?]
    at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13]
    at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13]
    at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
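The "Capping ... throttle delay from 179 to 5 seconds" entries above are INFO-level, and the attached java.lang.Throwable: null trace does not indicate a crash: attaching a freshly created Throwable to a log statement is a common Java idiom for recording the caller's stack, which is how these entries read, with the FlowReader frames pointing at the bulk-o-matic read transaction being closed as the call site. A minimal sketch of that idiom, together with an assumed fixed 5-second cap on a growing reconnect delay, follows; the class and method names are illustrative only, not the cds-access-client API.

import java.util.logging.Level;
import java.util.logging.Logger;

// Hedged sketch, not OpenDaylight code: cap a requested backoff delay and log
// the call site by attaching a new Throwable that is never actually thrown.
final class ThrottleCapSketch {
    private static final Logger LOG = Logger.getLogger(ThrottleCapSketch.class.getName());
    private static final long MAX_DELAY_SECONDS = 5; // assumed cap, matching the values in the log

    static long capDelay(final long requestedSeconds) {
        final long capped = Math.min(requestedSeconds, MAX_DELAY_SECONDS);
        if (capped != requestedSeconds) {
            // The Throwable only captures where the cap was applied; nothing failed here.
            LOG.log(Level.INFO, "Capping throttle delay from " + requestedSeconds
                + " to " + capped + " seconds", new Throwable());
        }
        return capped;
    }

    public static void main(final String[] args) {
        capDelay(179); // mirrors the first capped value seen above
    }
}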
2025-09-06T01:53:40,034 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING 2025-09-06T01:53:40,035 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:53:42,509 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 175 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] 
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:53:47,511 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 170 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] 
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:53:52,513 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 166 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] 
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:53:55,084 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING 2025-09-06T01:53:55,084 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:53:57,365 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-3-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:628) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:242) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 
5 more 2025-09-06T01:53:57,515 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 161 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:54:02,518 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 157 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:54:07,523 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 152 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:54:10,134 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING 2025-09-06T01:54:10,134 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:54:12,526 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 148 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] 
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:54:17,530 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 143 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] 
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:54:18,404 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-3-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:628) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:242) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 5 more 2025-09-06T01:54:22,532 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 139 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:54:25,184 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15050 ms in state COMMIT_PENDING 2025-09-06T01:54:25,184 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:54:27,533 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 134 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] 
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:54:32,535 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 130 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] 
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:54:37,537 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 125 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] 
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:54:39,444 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-3-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:628) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:242) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 
5 more 2025-09-06T01:54:40,244 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING 2025-09-06T01:54:40,244 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:54:42,538 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 120 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] 
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:54:47,540 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 115 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] 
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:54:52,541 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 110 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] 
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:54:55,293 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING 2025-09-06T01:54:55,294 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:54:57,543 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 105 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] 
at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:55:02,545 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 100 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] 
at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:55:07,547 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 95 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] 
at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:55:10,355 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING 2025-09-06T01:55:10,356 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:55:12,548 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 90 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:55:17,550 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 85 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:55:22,551 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 80 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:55:25,404 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING 2025-09-06T01:55:25,404 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:55:27,554 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 75 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:55:32,556 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 70 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:55:37,557 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 65 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-09-06T01:55:40,464 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING 2025-09-06T01:55:40,464 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:55:42,559 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 60 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] 
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:55:47,562 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 55 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] 
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:55:52,565 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 50 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] 
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:55:55,524 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING 2025-09-06T01:55:55,524 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort 2025-09-06T01:55:57,566 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 45 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] 
at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:56:02,570 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 40 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] 
at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T01:56:07,572 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 35 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] 
at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?]
at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13]
at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13]
at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?]
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?]
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?]
at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:56:10,564 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15039 ms in state COMMIT_PENDING
2025-09-06T01:56:10,564 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-09-06T01:56:12,573 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 30 to 5 seconds
java.lang.Throwable: null
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?]
at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?]
at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?]
at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?]
at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?]
at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?]
at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13]
at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13]
at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?]
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?]
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?]
at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:56:17,575 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 25 to 5 seconds
java.lang.Throwable: null
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?]
at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?]
at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?]
at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?]
at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?]
at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?]
at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13]
at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13]
at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?]
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?]
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?]
at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:56:22,578 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 20 to 5 seconds
java.lang.Throwable: null
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?]
at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?]
at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?]
at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?]
at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?]
at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?]
at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13]
at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13]
at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?]
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?]
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?]
at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:56:25,624 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-09-06T01:56:25,626 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-09-06T01:56:27,582 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 15 to 5 seconds
java.lang.Throwable: null
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?]
at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?]
at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?]
at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?]
at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?]
at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?]
at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13]
at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13]
at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?]
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?]
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?]
at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:56:32,584 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 10 to 5 seconds
java.lang.Throwable: null
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?]
at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?]
at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?]
at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?]
at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?]
at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?]
at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13]
at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13]
at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?]
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?]
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?]
at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:56:37,586 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 5 to 5 seconds
java.lang.Throwable: null
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?]
at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?]
at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?]
at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?]
at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?]
at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?]
at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13]
at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13]
at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?]
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?]
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?]
at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T01:56:40,684 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING
2025-09-06T01:56:40,684 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-09-06T01:56:55,734 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING
2025-09-06T01:56:55,734 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-09-06T01:57:10,773 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15039 ms in state COMMIT_PENDING
2025-09-06T01:57:10,774 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-09-06T01:57:25,824 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15050 ms in state COMMIT_PENDING
2025-09-06T01:57:25,824 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-09-06T01:57:40,883 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-09-06T01:57:40,884 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-09-06T01:57:55,924 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15040 ms in state COMMIT_PENDING
2025-09-06T01:57:55,924 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-09-06T01:58:10,983 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-09-06T01:58:10,984 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-09-06T01:58:26,033 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15050 ms in state COMMIT_PENDING
2025-09-06T01:58:26,034 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-09-06T01:58:41,094 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15060 ms in state COMMIT_PENDING
2025-09-06T01:58:41,095 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-09-06T01:58:56,154 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15059 ms in state COMMIT_PENDING
2025-09-06T01:58:56,154 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-09-06T01:59:11,204 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Current transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 has timed out after 15049 ms in state COMMIT_PENDING
2025-09-06T01:59:11,204 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: Transaction member-1-datastore-config-fe-1-chn-6-txn-0-1 is still committing, cannot abort
2025-09-06T01:59:14,173 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Verification After Adding Bulk Flow" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Verification After Adding Bulk Flow
2025-09-06T01:59:14,629 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Comparison Before And After Addition Of Flow" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Comparison Before And After Addition Of Flow
2025-09-06T01:59:15,064 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Delete and Add ten percent of the flows for 5 iterations" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Delete and Add ten percent of the flows for 5 iterations
2025-09-06T01:59:15,138 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$m], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:16,155 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$n], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:17,174 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$o], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:18,194 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$p], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:19,214 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$q], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:20,234 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$r], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:21,254 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$s], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:22,273 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$t], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:23,294 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$u], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:24,314 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$v], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:25,334 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$w], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:26,354 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$x], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:27,374 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$y], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:28,396 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$z], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:29,415 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$A], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:30,434 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$B], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:31,454 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$C], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:32,474 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$D], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:33,494 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$E], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:34,514 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$F], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:35,535 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$G], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:36,554 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$H], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:37,574 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$I], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:38,594 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$J], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:39,614 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$K], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:40,635 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$L], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:41,654 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$M], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:42,674 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$N], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:43,695 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$O], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:44,714 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$P], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:45,735 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Q], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:46,754 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$R], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:47,774 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$S], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:48,795 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$T], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:49,814 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$U], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:50,834 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$V], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:51,854 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$W], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:52,874 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$X], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:53,894 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Y], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:54,914 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Z], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:55,934 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$0], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:56,954 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$1], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:57,975 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$2], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T01:59:58,994 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$3], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:00:00,015 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$4], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:01,035 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$5], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:02,055 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$6], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:03,074 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$7], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:04,094 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$8], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:00:05,114 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$9], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:06,134 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$+], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:07,154 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$~], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:08,176 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ab], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:09,194 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$bb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:00:10,215 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$cb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:11,235 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$db], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:12,256 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$eb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:13,275 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$fb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:14,295 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$gb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:00:15,315 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$hb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:16,335 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ib], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:17,355 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$jb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:18,375 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$kb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:19,394 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$lb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:00:20,415 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$mb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:21,435 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$nb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:22,454 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ob], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:23,475 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$pb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:24,495 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$qb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:00:25,515 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$rb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:26,535 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$sb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:27,555 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$tb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:28,575 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ub], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:29,595 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$vb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:00:30,616 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$wb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:31,635 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$xb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:32,655 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$yb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:33,675 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$zb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:34,695 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ab], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:00:35,715 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Bb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:36,735 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Cb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:37,755 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Db], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:38,775 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Eb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:39,796 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Fb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:00:40,815 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Gb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:41,834 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Hb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:42,855 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ib], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:43,875 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Jb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:44,895 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Kb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:00:45,915 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Lb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:46,935 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Mb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:47,956 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Nb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:48,975 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ob], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:49,995 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Pb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:00:51,015 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Qb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:52,035 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Rb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:53,055 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Sb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:54,075 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Tb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:55,096 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ub], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:00:56,116 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Vb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:57,135 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Wb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:58,156 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Xb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:00:59,176 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Yb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:00,196 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Zb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:01:01,216 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$0b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:02,236 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$1b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:03,255 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$2b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:04,276 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$3b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:05,296 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$4b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:01:06,316 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$5b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:07,336 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$6b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:08,355 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$7b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:09,378 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$8b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:10,395 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$9b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:01:11,415 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$+b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:12,436 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$~b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:13,455 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ac], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:14,475 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$bc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:15,495 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$cc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:01:16,516 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$dc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:17,535 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ec], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:18,555 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$fc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:19,575 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$gc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:20,595 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$hc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:01:21,615 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ic], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:22,636 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$jc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:23,655 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$kc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:24,677 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$lc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:25,696 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$mc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:01:26,716 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$nc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:27,735 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$oc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:28,755 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$pc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:29,776 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$qc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:30,795 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$rc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:01:31,816 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$sc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:32,836 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$tc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:33,856 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$uc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:34,875 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$vc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:35,896 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$wc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:01:36,915 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$xc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:37,935 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$yc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:38,955 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$zc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:39,976 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ac], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:40,995 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Bc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:01:42,015 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Cc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:43,036 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Dc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:44,056 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ec], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:45,076 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Fc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:46,095 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Gc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:01:47,116 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Hc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:48,136 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ic], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:49,156 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Jc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:50,176 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Kc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:51,196 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Lc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:01:52,215 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Mc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:53,236 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Nc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:54,255 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Oc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:55,276 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Pc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:56,295 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Qc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:01:57,316 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Rc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:58,335 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Sc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:01:59,356 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Tc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:00,376 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Uc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:01,396 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Vc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:02:02,416 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Wc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:03,436 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Xc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:04,456 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Yc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:05,476 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Zc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:06,496 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$0c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:02:07,515 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$1c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:08,536 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$2c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:09,556 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$3c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:10,576 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$4c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:11,596 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$5c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:02:12,615 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$6c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:13,636 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$7c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:14,657 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$8c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:15,676 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$9c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:16,696 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$+c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:02:17,716 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$~c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:18,737 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ad], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:19,756 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$bd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:20,776 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$cd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:21,796 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$dd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:02:22,816 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ed], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:23,836 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$fd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:24,857 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$gd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:25,876 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$hd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:26,897 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$id], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:02:27,916 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$jd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:28,937 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$kd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:29,957 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ld], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:30,977 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$md], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:31,997 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$nd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:02:33,016 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$od], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:34,037 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$pd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:35,056 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$qd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:36,077 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$rd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:37,097 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$sd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:02:38,117 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$td], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:39,136 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ud], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:40,156 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$vd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:41,176 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$wd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:42,197 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$xd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:02:43,216 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$yd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:44,237 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$zd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:45,257 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ad], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:46,276 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Bd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:47,296 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Cd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:02:48,317 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Dd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:49,337 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ed], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:50,356 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Fd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:51,376 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Gd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:52,396 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Hd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:02:53,416 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Id], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:54,437 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Jd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:55,461 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Kd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:56,487 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ld], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:57,506 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Md], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:02:58,526 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Nd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:02:59,547 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Od], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:00,567 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Pd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:01,588 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Qd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:02,606 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Rd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:03:03,627 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Sd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:04,647 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Td], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:05,671 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ud], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:06,687 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Vd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:07,706 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Wd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:03:08,726 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Xd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:09,747 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Yd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:10,767 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Zd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:11,788 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$0d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:12,806 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$1d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:03:13,827 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$2d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:14,847 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$3d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:15,867 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$4d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:16,887 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$5d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:17,907 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$6d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:03:18,927 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$7d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:19,947 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$8d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:20,967 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$9d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:21,987 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$+d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:23,007 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$~d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:03:24,028 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ae], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:25,046 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$be], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:26,067 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ce], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:27,086 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$de], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:28,108 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ee], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:03:29,127 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$fe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:30,147 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ge], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:31,168 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$he], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:32,187 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ie], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:33,207 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$je], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:03:34,229 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ke], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:35,247 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$le], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:36,267 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$me], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:37,287 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ne], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:38,307 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$oe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:03:39,328 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$pe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:40,347 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$qe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:41,367 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$re], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:42,387 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$se], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:43,407 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$te], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:03:44,428 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ue], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:45,447 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ve], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:46,467 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$we], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:47,487 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$xe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:48,507 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ye], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:03:49,527 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ze], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:50,546 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ae], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:51,567 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Be], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:52,588 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ce], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:53,607 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$De], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:03:54,627 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ee], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:55,647 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Fe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:56,667 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ge], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:57,687 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$He], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:03:58,707 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ie], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:03:59,728 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Je], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:00,747 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ke], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:01,767 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Le], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:02,787 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Me], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:03,807 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ne], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:04:04,827 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Oe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:05,848 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Pe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:06,867 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Qe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:07,887 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Re], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:08,907 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Se], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:04:09,927 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Te], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:10,947 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ue], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:11,967 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ve], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:12,987 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$We], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:14,007 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Xe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:04:15,027 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ye], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:16,047 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ze], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:17,067 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$0e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:18,087 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$1e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:19,107 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$2e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:04:20,127 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$3e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:21,148 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$4e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:22,167 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$5e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:23,187 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$6e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:24,207 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$7e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:04:25,227 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$8e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:26,247 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$9e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:27,267 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$+e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:28,288 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$~e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:29,308 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$af], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:04:30,329 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$bf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:31,347 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$cf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:32,368 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$df], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:33,387 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ef], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:34,408 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ff], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:04:35,427 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$gf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:36,448 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$hf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:37,467 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$if], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:38,488 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$jf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:39,508 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$kf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:04:40,528 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$lf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:41,547 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$mf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:42,567 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$nf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:43,588 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$of], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:44,608 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$pf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:04:45,628 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$qf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:46,648 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$rf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:47,667 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$sf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:48,687 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$tf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:49,707 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$uf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:04:50,727 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$vf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:51,747 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$wf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:52,768 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$xf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:53,787 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$yf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:04:54,807 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$zf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:04:55,828 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Af], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:04:56,847 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Bf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:04:57,567 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | PhiAccrualFailureDetector | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | heartbeat interval is growing too large for address pekko://opendaylight-cluster-data@10.30.171.248:2550: 2071 millis
2025-09-06T02:04:57,867 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Cf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:04:58,887 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Df], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:04:59,908 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ef], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:05:00,928 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ff], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:01,948 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Gf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:02,968 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Hf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:03,988 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$If], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:05,007 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Jf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:05:06,028 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Kf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:07,048 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Lf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:08,068 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Mf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:09,088 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Nf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:10,108 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Of], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:05:11,128 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Pf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:12,148 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Qf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:13,168 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Rf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:14,188 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Sf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:15,208 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Tf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:05:16,227 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Uf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:17,248 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Vf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:18,268 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Wf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:19,287 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Xf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:20,308 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Yf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:05:21,328 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Zf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:22,348 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$0f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:23,368 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$1f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:24,387 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$2f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:25,408 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$3f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:05:26,428 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$4f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:27,447 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$5f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:28,468 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$6f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:29,489 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$7f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:30,509 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$8f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:05:31,528 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$9f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:32,549 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$+f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:33,568 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$~f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:34,587 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ag], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:35,607 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$bg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:05:36,628 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$cg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:37,648 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$dg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:38,668 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$eg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:39,688 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$fg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:40,708 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$gg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:05:41,727 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$hg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:42,748 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ig], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:43,768 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$jg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:44,788 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$kg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:45,809 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$lg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:05:46,828 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$mg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:47,847 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ng], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:48,868 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$og], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:49,888 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$pg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:05:50,909 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$qg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:05:51,928 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$rg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:05:52,947 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$sg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:05:53,968 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$tg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:05:54,988 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ug], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:05:55,740 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Verification After Continuous Deletion and Addition Of Flows for 5 iterations" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Verification After Continuous Deletion and Addition Of Flows for 5 iterations
2025-09-06T02:05:56,009 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$vg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:05:56,322 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Comparison Before and After Continuous Deletion and Addition Of Flows for 5 iterations" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Comparison Before and After Continuous Deletion and Addition Of Flows for 5 iterations
2025-09-06T02:05:56,672 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Delete All Flows From Follower Node" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Delete All Flows From Follower Node
2025-09-06T02:05:57,028 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$wg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:05:58,047 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$xg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:05:59,068 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$yg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:06:00,087 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$zg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:06:01,108 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ag], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:02,129 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Bg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:03,149 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Cg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:04,168 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Dg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:05,188 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Eg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:06:06,207 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Fg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:07,228 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Gg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:08,249 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Hg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:09,268 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ig], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:10,288 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Jg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:06:11,308 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Kg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:12,328 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Lg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:13,348 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Mg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:14,368 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ng], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:15,388 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Og], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:06:16,408 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Pg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:17,429 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Qg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:18,448 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Rg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:19,468 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Sg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:20,488 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Tg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:06:21,509 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ug], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:22,529 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Vg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:23,549 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Wg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:24,569 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Xg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:25,589 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Yg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:06:26,229 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | PhiAccrualFailureDetector | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | heartbeat interval is growing too large for address pekko://opendaylight-cluster-data@10.30.171.248:2550: 2842 millis 2025-09-06T02:06:26,608 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Zg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:27,629 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$0g], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:28,648 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$1g], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:29,668 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$2g], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:30,688 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$3g], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:06:31,709 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$4g], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:32,729 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$5g], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:33,748 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$6g], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:34,769 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$7g], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:35,788 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$8g], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:06:36,809 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$9g], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:37,828 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$+g], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:38,848 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$~g], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:39,884 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ah], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:40,899 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$bh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:06:41,919 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ch], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:42,938 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$dh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:43,959 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$eh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:44,979 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$fh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:45,998 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$gh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:06:47,018 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$hh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:48,039 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ih], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:49,059 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$jh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:50,079 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$kh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:51,098 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$lh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:06:52,119 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$mh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:53,138 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$nh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:54,158 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$oh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:55,178 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ph], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:56,199 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$qh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:06:57,218 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$rh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:58,238 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$sh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:06:59,259 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$th], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:00,279 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$uh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:01,300 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$vh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:07:02,319 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$wh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:03,338 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$xh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:04,359 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$yh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:05,378 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$zh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:06,399 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ah], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:07:07,418 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Bh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:08,438 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ch], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:09,459 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Dh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:10,479 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Eh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:11,881 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Fh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:07:12,898 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Gh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:13,918 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Hh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:14,939 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ih], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:15,958 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Jh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:16,978 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Kh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:07:17,999 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Lh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:19,019 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Mh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:20,039 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Nh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:21,059 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Oh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:22,079 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ph], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:07:23,099 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Qh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:24,119 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Rh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:25,139 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Sh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:26,158 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Th], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:27,179 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Uh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:07:28,199 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Vh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:29,219 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Wh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:30,238 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Xh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:31,258 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Yh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:32,279 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Zh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:07:33,299 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$0h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:34,318 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$1h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:35,338 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$2h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:36,360 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$3h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:37,380 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$4h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:07:38,399 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$5h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:39,418 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$6h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:40,438 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$7h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:41,459 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$8h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:42,479 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$9h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:07:43,499 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$+h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:44,519 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$~h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:45,539 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ai], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:46,558 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$bi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:47,579 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ci], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:07:48,599 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$di], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:07:49,619 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ei], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:07:50,639 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$fi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:07:50,898 | INFO | ForkJoinPool-10-worker-1 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 6 to 5 seconds
java.lang.Throwable: null
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?]
    at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?]
    at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13]
    at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13]
    at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
2025-09-06T02:07:51,659 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$gi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:07:52,679 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$hi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:07:53,698 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ii], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:54,718 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ji], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:55,738 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ki], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:56,759 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$li], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:57,778 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$mi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:07:58,798 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ni], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:07:59,819 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$oi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:00,839 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$pi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:01,859 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$qi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:02,879 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ri], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:08:03,900 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$si], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:04,920 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ti], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:05,939 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ui], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:06,959 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$vi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:07,979 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$wi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:08:08,999 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$xi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:10,019 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$yi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:11,039 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$zi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:12,059 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ai], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:13,079 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Bi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:08:14,100 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ci], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:15,120 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Di], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:16,139 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ei], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:17,305 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Fi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:18,319 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Gi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:08:19,339 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Hi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:21,324 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ii], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:22,339 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ji], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:24,134 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ki], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:25,149 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Li], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:08:26,169 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Mi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:27,190 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ni], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:28,209 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Oi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:28,552 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Received UnreachableMember: memberName MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.171.248:2550 2025-09-06T02:08:28,552 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | Received UnreachableMember: memberName MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.171.248:2550 2025-09-06T02:08:28,554 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | SBR found unreachable members, waiting for stable-after = 7000 ms before taking downing decision. Now 1 unreachable members found. Downing decision will not be made before 2025-09-06T02:08:35.553672015Z. 2025-09-06T02:08:29,144 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Marking node as UNREACHABLE [Member(pekko://opendaylight-cluster-data@10.30.171.248:2550, Up)]. 
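[Editor's note] The SplitBrainResolver entry above defers its downing decision until the unreachable-member set has been stable for stable-after = 7000 ms ("Downing decision will not be made before 2025-09-06T02:08:35.553672015Z"), and, as a later entry shows, resets that wait when the set changes. The sketch below only illustrates that timing rule with java.time; the class name StableAfterTimer and its shape are hypothetical, not Pekko's SplitBrainResolver.

    import java.time.Duration;
    import java.time.Instant;

    // Minimal sketch of the stable-after wait reported by the SBR log entries.
    final class StableAfterTimer {
        private final Duration stableAfter;
        private Instant earliestDecision;

        StableAfterTimer(Duration stableAfter, Instant now) {
            this.stableAfter = stableAfter;
            this.earliestDecision = now.plus(stableAfter);
        }

        // Called when the unreachable-member set changes; restarts the wait.
        void unreachableSetChanged(Instant now) {
            earliestDecision = now.plus(stableAfter);
        }

        boolean mayTakeDowningDecision(Instant now) {
            return !now.isBefore(earliestDecision);
        }

        public static void main(String[] args) {
            Instant observed = Instant.parse("2025-09-06T02:08:28.553672015Z");
            StableAfterTimer timer = new StableAfterTimer(Duration.ofMillis(7000), observed);
            // Matches the logged "not before 2025-09-06T02:08:35.553672015Z" threshold.
            System.out.println(timer.mayTakeDowningDecision(Instant.parse("2025-09-06T02:08:35.553672015Z")));
        }
    }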
2025-09-06T02:08:29,229 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Pi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:30,249 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Qi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:31,269 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ri], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:32,289 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Si], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:33,309 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ti], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:08:34,330 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ui], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:35,349 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Vi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:36,075 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | SBR took decision DownUnreachable and is downing [pekko://opendaylight-cluster-data@10.30.171.248:2550], [1] unreachable of [3] members, all members in DC [Member(pekko://opendaylight-cluster-data@10.30.170.226:2550, Up), Member(pekko://opendaylight-cluster-data@10.30.171.195:2550, Up), Member(pekko://opendaylight-cluster-data@10.30.171.248:2550, Up)], full reachability status: [pekko://opendaylight-cluster-data@10.30.170.226:2550 -> pekko://opendaylight-cluster-data@10.30.171.248:2550: Unreachable [Unreachable] (3), pekko://opendaylight-cluster-data@10.30.171.195:2550 -> pekko://opendaylight-cluster-data@10.30.171.248:2550: Unreachable [Unreachable] (3)] 2025-09-06T02:08:36,075 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | SBR is downing [UniqueAddress(pekko://opendaylight-cluster-data@10.30.171.248:2550,5423515263267496285)] 2025-09-06T02:08:36,076 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Marking unreachable node [pekko://opendaylight-cluster-data@10.30.171.248:2550] as [Down] 2025-09-06T02:08:36,076 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.apache.pekko.cluster.ClusterEvent$MemberDowned] to Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#-1849545225] was unhandled. [269] dead letters encountered, of which 258 were not logged. The counter will be reset now. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
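[Editor's note] The dead-letter entry above points at the settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. The snippet below is a minimal sketch, assuming the Typesafe Config library used by Pekko is on the classpath, of how such overrides could be layered onto the loaded configuration; the "off" values are an assumption for illustration, not something this log prescribes.

    import com.typesafe.config.Config;
    import com.typesafe.config.ConfigFactory;

    // Sketch only: the two setting names come straight from the dead-letter entries above;
    // the values shown are assumed, and normally these keys would live in application.conf.
    final class DeadLetterLoggingConfig {
        public static void main(String[] args) {
            Config overrides = ConfigFactory.parseString(
                    "pekko.log-dead-letters = off\n"
                    + "pekko.log-dead-letters-during-shutdown = off\n");
            // Layer the overrides on top of whatever configuration is already present.
            Config effective = overrides.withFallback(ConfigFactory.load()).resolve();
            System.out.println(effective.getString("pekko.log-dead-letters"));
        }
    }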
2025-09-06T02:08:36,076 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.apache.pekko.cluster.ClusterEvent$MemberDowned] to Actor[pekko://opendaylight-cluster-data/user/rpc/registry/gossiper#58666379] was unhandled. [1] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T02:08:36,076 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | SBR found unreachable members changed during stable-after period. Resetting timer. Now 1 unreachable members found. Downing decision will not be made before 2025-09-06T02:08:43.076433502Z. 2025-09-06T02:08:36,370 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Wi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:37,107 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-topology-config (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=4, success=true, followerId=member-2-shard-topology-config, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 12972, lastApplied : -1, commitIndex : -1 2025-09-06T02:08:37,107 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-operational (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=6, success=true, followerId=member-2-shard-default-operational, logLastIndex=108, logLastTerm=6, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 12972, lastApplied : 108, commitIndex : 108 2025-09-06T02:08:37,107 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-toaster-config (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=4, success=true, followerId=member-2-shard-toaster-config, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 12972, lastApplied : -1, commitIndex : -1 2025-09-06T02:08:37,107 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.0 | member-3-shard-default-config (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=4, success=true, 
followerId=member-2-shard-default-config, logLastIndex=184, logLastTerm=4, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 12973, lastApplied : 184, commitIndex : 184 2025-09-06T02:08:37,108 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | ClusterGossip | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Ignoring received gossip status from unreachable [UniqueAddress(pekko://opendaylight-cluster-data@10.30.171.248:2550,5423515263267496285)] 2025-09-06T02:08:37,295 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Leader is removing unreachable node [pekko://opendaylight-cluster-data@10.30.171.248:2550] 2025-09-06T02:08:37,295 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-operational: Received MemberRemoved: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.171.248:2550 2025-09-06T02:08:37,296 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | shard-manager-config: Received MemberRemoved: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.171.248:2550 2025-09-06T02:08:37,296 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | Association | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Association to [pekko://opendaylight-cluster-data@10.30.171.248:2550] with UID [5423515263267496285] is irrecoverably failed. UID is now quarantined and all messages to this UID will be delivered to dead letters. Remote ActorSystem must be restarted to recover from this situation. Reason: Cluster member removed, previous status [Down] 2025-09-06T02:08:37,328 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:37,328 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:37,389 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Xi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. 
isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:37,446 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:37,447 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:37,567 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:37,567 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:37,657 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:37,657 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:37,657 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:37,657 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. 
Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:37,657 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:37,657 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:37,657 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:37,658 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:37,807 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:37,807 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:38,120 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:38,120 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. 
Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:38,177 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:38,177 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:38,178 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:38,178 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:38,178 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:38,178 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:38,178 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:38,179 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. 
Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:38,409 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Yi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:38,696 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:38,696 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:38,696 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:38,697 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:38,697 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:38,697 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:38,697 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. 
It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:38,697 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:38,926 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:38,927 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:38,927 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | InboundActorRefCompression | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Inbound message from originUid [5423515263267496285] is using unknown compression table version. It may have been sent with compression table built for previous incarnation of this system. Versions activeTable: 0, nextTable: 1, incomingTable: 31 2025-09-06T02:08:39,430 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Zi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:40,449 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$0i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:08:40,679 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:40,679 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], control stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:41,196 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:41,469 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$1i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:41,716 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:42,489 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$2i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:08:42,756 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:43,275 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:43,276 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], control stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:43,509 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$3i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:43,796 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:44,314 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Member removed [pekko://opendaylight-cluster-data@10.30.171.248:2550] 2025-09-06T02:08:44,529 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$4i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:08:45,356 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:45,550 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$5i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:45,876 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:46,400 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:46,570 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$6i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:46,915 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:47,435 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:47,589 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$7i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:08:47,956 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:48,476 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:48,609 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$8i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:48,995 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:49,515 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:49,629 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$9i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:08:50,035 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:50,555 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:50,650 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$+i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:51,595 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:51,669 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$~i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:52,115 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:52,636 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:52,690 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$aj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:08:53,675 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:53,709 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$bj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:54,196 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:54,729 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$cj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:55,749 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$dj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:56,276 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:56,769 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ej], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:08:57,789 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$fj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:58,356 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:08:58,810 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$gj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:08:59,830 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$hj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:00,849 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ij], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:09:01,466 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:01,869 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$jj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:02,890 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$kj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:03,545 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:03,909 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$lj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:09:04,066 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:04,586 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:04,929 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$mj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:05,950 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$nj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:06,146 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:06,970 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$oj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:09:07,185 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:07,990 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$pj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:09,010 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$qj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:09,787 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:10,029 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$rj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:10,306 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:11,049 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$sj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:09:11,345 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:12,070 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$tj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:12,907 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:13,089 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$uj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:13,949 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:14,109 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$vj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:09:14,985 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:15,129 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$wj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:16,024 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-topology-config#-1103323626] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [2] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T02:09:16,025 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-default-config#1742182425] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [3] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T02:09:16,025 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-toaster-config#972328392] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [4] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
2025-09-06T02:09:16,025 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [5] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T02:09:16,025 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-3-shard-default-operational#-1082326140] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [6] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T02:09:16,025 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-topology-config#-1103323626] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [7] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T02:09:16,025 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-toaster-config#972328392] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [8] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T02:09:16,026 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-default-config#1742182425] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [9] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
2025-09-06T02:09:16,026 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [10] dead letters encountered, no more dead letters will be logged in next [5.000 min]. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T02:09:16,027 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: Association$OutboundStreamStopQuarantinedSignal$: 2025-09-06T02:09:16,149 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$xj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:16,545 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:17,065 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:17,169 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$yj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
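The dead-letter entries above name the two settings that govern this logging. A minimal sketch of how they could be tuned, assuming the deployment's standard Pekko HOCON configuration file; only the key names come from the log, the file location and values below are illustrative assumptions:

    # assumed example values, not taken from this log:
    # log at most 10 dead letters per window, and none during shutdown
    pekko.log-dead-letters = 10
    pekko.log-dead-letters-during-shutdown = off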
2025-09-06T02:09:18,190 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$zj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:18,626 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:19,209 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Aj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:20,229 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Bj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:20,705 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:21,249 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Cj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:09:21,746 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:22,266 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: Association$OutboundStreamStopQuarantinedSignal$: 2025-09-06T02:09:22,270 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Dj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:23,290 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ej], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:23,306 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:24,309 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Fj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:09:24,347 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:24,866 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: Association$OutboundStreamStopQuarantinedSignal$: 2025-09-06T02:09:25,329 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Gj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:25,386 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: Association$OutboundStreamStopQuarantinedSignal$: 2025-09-06T02:09:26,350 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Hj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:26,428 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: Association$OutboundStreamStopQuarantinedSignal$: 2025-09-06T02:09:27,369 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ij], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:09:27,985 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: Association$OutboundStreamStopQuarantinedSignal$: 2025-09-06T02:09:28,389 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Jj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:28,505 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:28,506 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], control stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:29,025 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:29,410 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Kj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:29,545 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:30,429 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Lj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:09:31,450 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Mj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:32,470 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Nj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:33,490 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Oj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:33,705 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:34,226 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:34,510 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Pj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:09:34,735 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:35,530 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Qj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:36,296 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:36,550 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Rj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:37,336 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:37,570 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Sj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:09:37,855 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:38,376 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:38,589 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Tj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:38,895 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:39,609 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Uj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:39,935 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:40,456 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:40,630 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Vj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:09:41,650 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Wj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:42,535 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:42,670 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Xj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:43,055 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:43,690 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Yj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:09:44,615 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:09:44,709 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Zj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:09:45,135 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:09:45,729 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$0j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:09:46,749 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$1j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:09:47,215 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:09:47,769 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$2j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:09:48,776 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:09:48,790 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$3j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:09:49,295 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:09:49,805 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:09:49,810 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$4j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:09:50,830 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$5j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:09:51,850 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$6j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:09:52,869 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$7j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:09:53,889 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$8j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:09:53,965 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:09:54,910 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$9j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:09:55,526 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:09:55,930 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$+j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:09:56,565 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:09:56,950 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$~j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:09:57,605 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:09:57,970 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ak], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:09:58,125 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:09:58,645 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:09:58,989 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$bk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:09:59,685 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:00,010 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ck], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:00,726 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:01,030 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$dk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:02,050 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ek], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:03,072 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$fk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:03,328 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:04,091 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$gk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:05,109 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$hk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:05,395 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:06,129 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ik], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:06,954 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:07,150 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$jk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:08,170 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$kk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:08,515 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:09,199 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$lk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:10,076 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:10,220 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$mk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:11,239 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$nk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:12,260 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ok], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:13,289 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$pk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:13,706 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:14,310 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$qk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:15,329 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$rk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:16,350 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$sk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:17,370 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$tk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:17,857 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:18,390 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$uk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:19,409 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$vk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:20,430 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$wk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:20,455 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:20,975 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:21,450 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$xk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:22,470 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$yk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:23,490 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$zk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:24,095 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:24,510 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ak], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:25,135 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:25,530 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Bk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:25,656 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:26,549 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ck], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:27,205 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:27,570 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Dk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:28,590 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ek], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:29,610 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Fk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:30,629 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Gk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:31,649 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Hk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:32,670 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ik], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:33,690 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Jk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:34,709 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Kk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:35,730 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Lk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:36,046 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:36,750 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Mk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:37,076 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:37,595 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:37,770 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Nk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:38,115 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:38,635 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:38,790 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ok], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:39,809 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Pk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:40,705 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:40,829 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Qk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:41,850 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Rk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:42,870 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Sk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:43,816 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:43,890 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Tk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:44,855 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:44,910 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Uk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:45,895 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:45,930 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Vk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:46,416 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:46,936 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:46,951 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Wk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:47,456 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:47,970 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Xk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:48,495 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:48,990 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Yk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:49,015 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:49,536 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:50,010 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Zk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:50,578 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:51,030 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$0k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:52,050 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$1k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:53,070 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$2k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:54,090 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$3k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:54,216 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:55,110 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$4k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:55,256 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:10:56,130 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$5k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:57,150 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$6k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:10:58,170 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$7k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:10:58,895 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:10:59,190 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$8k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:00,210 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$9k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:00,456 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:01,231 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$+k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:11:02,250 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$~k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:03,270 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$al], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:04,291 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$bl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:05,310 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$cl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:06,330 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$dl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:11:06,676 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:07,351 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$el], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:08,236 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:08,370 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$fl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:08,754 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:09,275 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:09,389 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$gl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:11:10,410 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$hl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:10,835 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:11,430 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$il], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:12,450 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$jl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:13,470 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$kl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:14,489 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ll], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:11:14,985 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:15,510 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ml], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:16,530 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$nl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:17,551 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ol], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:18,570 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$pl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:19,590 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ql], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:11:20,187 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:20,610 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$rl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:21,225 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:21,630 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$sl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:22,650 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$tl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:23,671 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ul], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:11:24,350 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:24,691 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$vl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:25,710 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$wl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:25,895 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:26,730 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$xl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:27,750 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$yl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:11:28,495 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:28,770 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$zl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:29,790 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Al], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:30,575 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:30,821 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Bl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:31,840 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Cl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:11:32,135 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:32,655 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:32,860 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Dl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:33,696 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:33,880 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$El], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:34,900 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Fl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:11:35,267 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:35,920 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Gl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:36,296 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:36,940 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Hl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:37,960 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Il], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:38,375 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:38,980 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Jl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:11:40,000 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Kl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:40,975 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:41,020 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ll], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:42,041 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ml], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:43,060 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Nl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:44,080 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ol], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:11:45,100 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Pl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:46,120 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ql], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:46,145 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:46,666 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:47,141 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Rl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:48,161 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Sl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:11:48,245 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:49,181 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Tl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:50,201 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ul], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:50,835 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:51,221 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Vl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:51,336 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:52,240 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Wl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:11:52,396 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:53,261 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Xl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:53,935 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:54,281 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Yl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:55,300 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Zl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:56,320 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$0l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:11:57,340 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$1l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:57,575 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:58,095 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:58,361 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$2l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:11:59,135 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:11:59,380 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$3l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:11:59,654 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:12:00,165 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:12:00,400 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$4l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:12:00,685 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:12:01,420 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$5l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:12:02,440 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$6l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:12:02,765 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:12:03,285 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:12:03,461 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$7l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:12:03,806 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:12:04,480 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$8l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:12:04,845 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:12:05,501 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$9l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:12:06,520 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$+l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:12:06,925 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:12:07,541 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$~l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:12:08,485 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:12:08,561 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$am], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:12:09,526 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:12:09,580 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$bm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:12:10,045 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:10,600 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$cm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:11,085 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:11,620 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$dm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:12,641 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$em], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:13,660 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$fm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:14,680 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$gm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:14,715 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:15,235 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:15,700 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$hm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:15,755 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:16,275 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:16,720 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$im], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:16,796 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:17,740 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$jm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:18,761 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$km], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:19,780 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$lm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:20,426 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:20,800 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$mm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:21,821 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$nm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:22,505 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:22,841 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$om], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:23,026 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:23,860 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$pm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:24,586 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:24,881 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$qm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:25,106 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:25,625 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:25,900 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$rm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:26,665 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:26,920 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$sm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:27,185 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:27,705 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:27,940 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$tm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:28,960 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$um], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:29,981 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$vm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:30,825 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:31,000 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$wm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:32,020 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$xm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:32,385 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:32,895 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:33,041 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ym], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:34,060 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$zm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:35,080 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Am], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:36,101 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Bm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:36,535 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:37,055 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:37,121 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Cm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:37,575 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:37,788 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Verify No Flows In Cluster After Flow Deletion" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Verify No Flows In Cluster After Flow Deletion
2025-09-06T02:12:38,104 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:38,140 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Dm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:39,135 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:39,161 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Em], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:40,191 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Fm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:41,210 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Gm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:42,231 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Hm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:42,775 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:43,251 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Im], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:43,305 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:43,825 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:44,271 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Jm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:44,355 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:44,846 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:45,291 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Km], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:46,311 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Lm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:46,394 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:47,331 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Mm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:47,954 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:48,351 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Nm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:48,475 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:49,014 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:49,375 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Om], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:49,515 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:50,390 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Pm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:51,411 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Qm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:52,430 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Rm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:53,451 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Sm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:53,665 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:54,185 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:54,471 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Tm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:54,705 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:55,491 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Um], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:55,745 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:56,265 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:56,510 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Vm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:57,305 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:57,531 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Wm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:57,825 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:58,345 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:58,551 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Xm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:59,385 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:12:59,570 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ym], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:12:59,905 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:00,425 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:00,590 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Zm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:01,611 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$0m], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:01,977 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:02,631 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$1m], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:03,015 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:03,651 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$2m], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:04,671 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$3m], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:05,691 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$4m], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:06,137 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:06,711 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$5m], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:07,175 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:07,696 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:07,732 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$6m], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:08,752 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$7m], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:09,771 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$8m], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:09,775 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:10,792 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$9m], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:11,811 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$+m], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:12,375 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:12,830 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$~m], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:12,896 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:13,851 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$an], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:14,872 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$bn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:15,891 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$cn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:16,911 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$dn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:17,931 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$en], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:18,085 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:18,951 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$fn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:19,971 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$gn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:20,165 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:20,686 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:20,991 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$hn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:22,011 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$in], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:22,765 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:23,031 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$jn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:23,804 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:24,051 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$kn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:25,071 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ln], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:25,365 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:25,885 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:26,090 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$mn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:26,405 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:27,111 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$nn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:27,435 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:28,131 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$on], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:29,150 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$pn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:30,171 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$qn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:31,201 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$rn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:31,595 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:32,221 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$sn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:33,155 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:33,241 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$tn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:34,195 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:34,261 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$un], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:35,235 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:35,281 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$vn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:36,275 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:36,301 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$wn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:37,321 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$xn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:37,835 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:38,341 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$yn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:38,355 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:38,875 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:39,362 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$zn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:39,395 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:40,381 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$An], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:41,401 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Bn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:41,476 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:13:42,421 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Cn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:13:43,441 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Dn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:13:43,555 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:13:44,461 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$En], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:13:44,595 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:45,115 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:45,481 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Fn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:46,501 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Gn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:47,521 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Hn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:47,715 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:48,237 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:48,541 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$In], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:48,755 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:49,561 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Jn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:49,795 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:50,581 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Kn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:50,835 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:51,602 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ln], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:52,621 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Mn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:52,915 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:53,435 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:53,641 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Nn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:54,475 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:54,661 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$On], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:55,515 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:55,681 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Pn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:56,035 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:56,702 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Qn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:57,721 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Rn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:58,115 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:58,635 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:13:58,741 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Sn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:13:59,761 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Tn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:14:00,695 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:14:00,781 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Un], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:14:01,801 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Vn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:14:02,822 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Wn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:14:03,841 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Xn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:14:04,861 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Yn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:14:05,880 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Zn], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:14:06,902 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$0n], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:14:07,425 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:14:07,920 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$1n], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:14:07,945 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:14:08,465 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:14:08,941 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$2n], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:14:09,002 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:14:09,505 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:14:09,961 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$3n], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:14:10,536 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:14:10,981 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$4n], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:14:12,001 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$5n], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:14:13,021 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$6n], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:14:13,136 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:14:14,042 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$7n], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:14:14,695 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:14:15,061 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$8n], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:14:16,081 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$9n], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:14:16,765 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:14:17,101 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$+n], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:14:17,804 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:14:18,121 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$~n], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:14:18,325 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:14:18,845 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:14:19,141 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ao], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:14:19,365 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:14:20,161 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$bo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:14:21,181 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$co], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:14:22,205 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$do], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:14:23,221 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$eo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:24,035 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:24,242 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$fo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:25,075 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:25,261 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$go], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:26,281 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ho], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:14:27,135 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:27,301 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$io], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:28,321 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$jo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:29,214 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:29,342 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ko], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:30,361 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$lo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:14:30,775 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:31,381 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$mo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:31,814 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:32,335 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:32,401 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$no], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:33,366 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:33,422 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$oo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:14:34,441 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$po], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:34,915 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:35,461 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$qo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:35,954 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:36,481 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ro], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:37,502 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$so], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:14:38,521 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$to], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:39,075 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:39,541 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$uo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:39,595 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:40,562 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$vo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:40,635 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:41,581 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$wo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:14:42,602 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$xo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:43,225 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:43,622 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$yo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:44,642 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$zo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:45,661 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ao], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
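Two messages dominate the excerpt above: member-3-shard-inventory-config rejecting ConnectClientRequest from member-1 roughly once per second while reporting isLeader: true but isLeaderActive: false, and the Pekko Materializer repeatedly losing the outbound stream to 10.30.171.248:2550 with StreamTcpException. Below is a minimal, stdlib-only sketch (not part of the log) for tallying both patterns per minute from the full file; the log path is a placeholder, and the whole file is read at once so the regexes still match when several entries share one physical line, as they do in this excerpt.

import re
from collections import Counter

# Hypothetical path; point this at the karaf.log the excerpt above comes from.
LOG_PATH = "karaf.log"

# Shard-side rejections: "<shard>: not currently leader, rejecting request ConnectClientRequest{...}"
REJECT = re.compile(
    r"(\d{4}-\d{2}-\d{2}T\d{2}:\d{2}):\d{2},\d{3}\s*\|\s*INFO\s*\|[^|]*\|[^|]*\|[^|]*\|\s*"
    r"\S+: not currently leader, rejecting request ConnectClientRequest"
)
# Transport-side failures: "[outbound connection to [pekko://...@<peer>:<port>], message stream] Upstream failed"
ABORT = re.compile(
    r"(\d{4}-\d{2}-\d{2}T\d{2}:\d{2}):\d{2},\d{3}\s*\|\s*WARN\s*\|[^|]*\|[^|]*\|[^|]*\|\s*"
    r"\[outbound connection to \[pekko://[^\]]*@([\d.]+):\d+\].*?Upstream failed"
)

with open(LOG_PATH, encoding="utf-8", errors="replace") as fh:
    text = fh.read()

rejects = Counter(REJECT.findall(text))                      # minute -> rejection count
aborts = Counter(minute for minute, _peer in ABORT.findall(text))

for minute in sorted(set(rejects) | set(aborts)):
    print(f"{minute}  rejected ConnectClientRequest: {rejects[minute]:3d}  "
          f"aborted outbound streams: {aborts[minute]:3d}")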
2025-09-06T02:14:45,826 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:46,446 | INFO | ForkJoinPool-10-worker-2 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Capping ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} throttle delay from 70 to 5 seconds java.lang.Throwable: null at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:261) ~[?:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.forwardEntry(SimpleReconnectForwarder.java:18) ~[?:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrForward(TransmitQueue.java:272) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueOrForward(AbstractClientConnection.java:182) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendEntry(AbstractClientConnection.java:256) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.sendRequest(AbstractClientConnection.java:159) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.sendRequest(ProxyHistory.java:570) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.sendRequest(AbstractProxyTransaction.java:321) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.abort(AbstractProxyTransaction.java:440) ~[?:?] at java.util.concurrent.ConcurrentHashMap$ValuesView.forEach(ConcurrentHashMap.java:4783) [?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.commonAbort(AbstractClientHandle.java:85) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractClientHandle.abort(AbstractClientHandle.java:71) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedTransaction.close(ClientBackedTransaction.java:62) [bundleFile:?] at org.opendaylight.controller.cluster.databroker.ClientBackedReadTransaction.close(ClientBackedReadTransaction.java:52) [bundleFile:?] at org.opendaylight.mdsal.dom.spi.AbstractDOMForwardedTransaction.closeSubtransactions(AbstractDOMForwardedTransaction.java:124) [bundleFile:14.0.13] at org.opendaylight.mdsal.dom.spi.DOMForwardedReadOnlyTransaction.close(DOMForwardedReadOnlyTransaction.java:52) [bundleFile:14.0.13] at org.opendaylight.mdsal.binding.dom.adapter.BindingDOMReadTransactionAdapter.close(BindingDOMReadTransactionAdapter.java:46) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:101) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:67) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 2025-09-06T02:14:46,681 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Bo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:46,865 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:47,702 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Co], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:47,905 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:48,424 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:48,722 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Do], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:14:48,945 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:49,741 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Eo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:50,761 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Fo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:51,025 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:51,545 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:51,781 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Go], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:14:52,069 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:52,585 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:52,801 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ho], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:53,821 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Io], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:14:54,145 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:54,449 | ERROR | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Queue ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} has not seen progress in 1217 seconds, failing all requests 2025-09-06T02:14:54,449 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: removed connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, poisoned=org.opendaylight.controller.cluster.access.client.NoProgressException: No progress in 1217 seconds, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-09-06T02:14:54,456 | ERROR | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, poisoned=org.opendaylight.controller.cluster.access.client.NoProgressException: No progress in 1217 seconds, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} has been poisoned java.lang.IllegalStateException: Connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, poisoned=org.opendaylight.controller.cluster.access.client.NoProgressException: No progress in 1217 seconds, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} has been poisoned at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.commonEnqueue(AbstractClientConnection.java:205) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueEntry(AbstractClientConnection.java:194) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.replayEntry(SimpleReconnectForwarder.java:24) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrReplay(TransmitQueue.java:281) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueEntry(AbstractClientConnection.java:195) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueRequest(AbstractClientConnection.java:175) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.enqueueRequest(ProxyHistory.java:565) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.enqueueRequest(AbstractProxyTransaction.java:316) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.enqueuePurge(AbstractProxyTransaction.java:631) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.LocalProxyTransaction.handleReplayedRemoteRequest(LocalProxyTransaction.java:124) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.doReplayRequest(AbstractProxyTransaction.java:744) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.replayRequest(AbstractProxyTransaction.java:778) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory$ReconnectCohort.replayEntry(ProxyHistory.java:295) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.BouncingReconnectForwarder.replayEntry(BouncingReconnectForwarder.java:71) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrReplay(TransmitQueue.java:281) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueEntry(AbstractClientConnection.java:195) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.replayEntry(SimpleReconnectForwarder.java:24) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrReplay(TransmitQueue.java:281) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueEntry(AbstractClientConnection.java:195) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueRequest(AbstractClientConnection.java:175) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.enqueueRequest(ProxyHistory.java:565) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.enqueueRequest(AbstractProxyTransaction.java:316) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.enqueuePurge(AbstractProxyTransaction.java:631) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.LocalProxyTransaction.handleReplayedRemoteRequest(LocalProxyTransaction.java:124) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.doReplayRequest(AbstractProxyTransaction.java:744) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.replayRequest(AbstractProxyTransaction.java:778) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory$ReconnectCohort.replayEntry(ProxyHistory.java:295) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.BouncingReconnectForwarder.replayEntry(BouncingReconnectForwarder.java:71) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrReplay(TransmitQueue.java:281) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueEntry(AbstractClientConnection.java:195) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.replayEntry(SimpleReconnectForwarder.java:24) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrReplay(TransmitQueue.java:281) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueEntry(AbstractClientConnection.java:195) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueRequest(AbstractClientConnection.java:175) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.enqueueRequest(ProxyHistory.java:565) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.enqueueRequest(AbstractProxyTransaction.java:316) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.enqueuePurge(AbstractProxyTransaction.java:631) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.LocalProxyTransaction.handleReplayedRemoteRequest(LocalProxyTransaction.java:124) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.doReplayRequest(AbstractProxyTransaction.java:744) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.replayRequest(AbstractProxyTransaction.java:778) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory$ReconnectCohort.replayEntry(ProxyHistory.java:295) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.BouncingReconnectForwarder.replayEntry(BouncingReconnectForwarder.java:71) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrReplay(TransmitQueue.java:281) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueEntry(AbstractClientConnection.java:195) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.replayEntry(SimpleReconnectForwarder.java:24) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrReplay(TransmitQueue.java:281) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueEntry(AbstractClientConnection.java:195) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueRequest(AbstractClientConnection.java:175) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.enqueueRequest(ProxyHistory.java:565) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.enqueueRequest(AbstractProxyTransaction.java:316) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.enqueuePurge(AbstractProxyTransaction.java:631) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.LocalProxyTransaction.handleReplayedRemoteRequest(LocalProxyTransaction.java:124) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.doReplayRequest(AbstractProxyTransaction.java:744) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.replayRequest(AbstractProxyTransaction.java:778) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory$ReconnectCohort.replayEntry(ProxyHistory.java:295) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.BouncingReconnectForwarder.replayEntry(BouncingReconnectForwarder.java:71) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrReplay(TransmitQueue.java:281) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueEntry(AbstractClientConnection.java:195) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.SimpleReconnectForwarder.replayEntry(SimpleReconnectForwarder.java:24) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.TransmitQueue.enqueueOrReplay(TransmitQueue.java:281) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueEntry(AbstractClientConnection.java:195) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.enqueueRequest(AbstractClientConnection.java:175) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.ProxyHistory.enqueueRequest(ProxyHistory.java:565) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.enqueueRequest(AbstractProxyTransaction.java:316) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.enqueuePurge(AbstractProxyTransaction.java:631) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.enqueuePurge(AbstractProxyTransaction.java:616) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.enqueuePurge(AbstractProxyTransaction.java:611) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.lambda$abort$1(AbstractProxyTransaction.java:442) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.poison(AbstractClientConnection.java:459) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.runTimer(AbstractClientConnection.java:364) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:47) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] 
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.client.NoProgressException: No progress in 1217 seconds at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.runTimer(AbstractClientConnection.java:357) ~[bundleFile:?] ... 26 more Caused by: org.opendaylight.controller.cluster.access.concepts.RuntimeRequestException: Backend connection timed out at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.runTimer(AbstractClientConnection.java:341) ~[bundleFile:?] ... 26 more Caused by: java.util.concurrent.TimeoutException at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.runTimer(AbstractClientConnection.java:341) ~[bundleFile:?] ... 26 more 2025-09-06T02:14:54,474 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: stopping resolution of shard 1 on stale connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=1, poisoned=org.opendaylight.controller.cluster.access.client.NoProgressException: No progress in 1217 seconds, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540], sessionId=94, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) ~[bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) ~[bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) ~[bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-3-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:628) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:242) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] ... 5 more 2025-09-06T02:14:54,475 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: Ignoring unknown response Envelope{sessionId=1, txSequence=e, message=TransactionPurgeResponse{target=member-3-datastore-config-fe-1-txn-3595-0, sequence=0}} 2025-09-06T02:14:54,476 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: Ignoring unknown response Envelope{sessionId=1, txSequence=f, message=TransactionPurgeResponse{target=member-3-datastore-config-fe-1-txn-3851-0, sequence=0}} 2025-09-06T02:14:54,476 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: Ignoring unknown response Envelope{sessionId=1, txSequence=10, message=TransactionPurgeResponse{target=member-3-datastore-config-fe-1-txn-4107-0, sequence=0}} 2025-09-06T02:14:54,476 | ERROR | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | AbstractClientConnection | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | Queue ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-default-config#1742182425], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}} has not seen progress in 1276 seconds, failing all requests 2025-09-06T02:14:54,476 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: failed to remove connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, cookie=0, poisoned=org.opendaylight.controller.cluster.access.client.NoProgressException: No progress in 1276 seconds, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-default-config#1742182425], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=0, 
shard=default, dataTree=present}}, as it was not tracked 2025-09-06T02:14:54,476 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: Ignoring unknown response Envelope{sessionId=1, txSequence=11, message=TransactionPurgeResponse{target=member-3-datastore-config-fe-1-txn-4363-0, sequence=0}} 2025-09-06T02:14:54,476 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: Ignoring unknown response Envelope{sessionId=1, txSequence=12, message=TransactionPurgeResponse{target=member-3-datastore-config-fe-1-txn-4619-0, sequence=0}} 2025-09-06T02:14:54,476 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: Ignoring unknown response Envelope{sessionId=1, txSequence=13, message=TransactionPurgeResponse{target=member-3-datastore-config-fe-1-txn-4875-0, sequence=0}} 2025-09-06T02:14:54,476 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: Ignoring unknown response Envelope{sessionId=1, txSequence=14, message=TransactionPurgeResponse{target=member-3-datastore-config-fe-1-txn-5131-0, sequence=0}} 2025-09-06T02:14:54,476 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.0 | member-3-frontend-datastore-config: Ignoring unknown response Envelope{sessionId=1, txSequence=15, message=TransactionPurgeResponse{target=member-3-datastore-config-fe-1-txn-5387-0, sequence=0}} 2025-09-06T02:14:54,478 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RepointableActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.cluster.databroker.actors.dds.AbstractDataStoreClientBehavior$$Lambda/0x00000007c1685c28] to Actor[pekko://opendaylight-cluster-data/user/$a#42788464] was not delivered. [46] dead letters encountered, of which 35 were not logged. The counter will be reset now. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/user/$a#42788464] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T02:14:54,841 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Jo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
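The two ERROR entries above ("has not seen progress in 1217 seconds" and "in 1276 seconds") are where the cds-access-client gives up: the connections for the inventory and default config shards are poisoned with NoProgressException and every queued request is failed. If the full log needs to be audited for such events, a small sketch along these lines (log path assumed, message format taken from the entries above) pulls out the affected frontend, shard, and stall time.

import re

# Hypothetical path; point this at the full karaf.log the excerpt above comes from.
LOG_PATH = "karaf.log"

# Matches "Queue <X>ClientConnection{client=ClientIdentifier{frontend=..., generation=N}
#   ... shard=<shard>, dataTree=...}} has not seen progress in <seconds> seconds"
NO_PROGRESS = re.compile(
    r"Queue \w*ClientConnection\{client=ClientIdentifier\{frontend=([\w-]+), generation=\d+\}"
    r".*?shard=(\w+), dataTree=\w+\}\} has not seen progress in (\d+) seconds",
    re.DOTALL,
)

with open(LOG_PATH, encoding="utf-8", errors="replace") as fh:
    text = fh.read()

for frontend, shard, seconds in NO_PROGRESS.findall(text):
    print(f"{frontend}: connection to the {shard} shard stalled for {seconds}s, all requests failed")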
2025-09-06T02:14:55,861 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ko], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:56,881 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Lo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:57,787 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:57,902 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Mo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:14:58,306 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:14:58,921 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$No], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:14:59,943 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Oo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:00,896 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:00,962 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Po], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:01,981 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Qo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:02,456 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:02,976 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:03,002 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ro], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:15:03,495 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:04,022 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$So], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:04,535 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:05,041 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$To], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:05,055 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:05,575 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:06,062 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Uo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:15:07,081 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Vo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:07,656 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:08,102 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Wo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:08,695 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:09,121 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Xo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:10,141 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Yo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:15:10,774 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:11,162 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Zo], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:11,295 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:11,815 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:12,181 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$0o], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:12,855 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:13,202 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$1o], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:15:14,222 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$2o], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:14,935 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:15,245 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$3o], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:15,455 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:16,261 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$4o], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:17,016 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:17,286 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$5o], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:15:17,535 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:18,303 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$6o], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:19,321 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$7o], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:19,616 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:20,341 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$8o], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:20,655 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:21,362 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$9o], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:15:22,381 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$+o], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:23,401 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$~o], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:24,275 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:24,421 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ap], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:24,796 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:25,442 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$bp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:15:25,835 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:26,462 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$cp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:27,395 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:27,481 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$dp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:28,425 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:28,502 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ep], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:29,522 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$fp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:15:29,975 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:30,494 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:30,542 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$gp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:31,561 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$hp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:32,565 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:32,581 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ip], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:33,601 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$jp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:15:34,622 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$kp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:35,641 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$lp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:35,685 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:36,205 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:36,662 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$mp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:37,245 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:37,682 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$np], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:15:38,285 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:38,701 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$op], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:39,722 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$pp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:40,364 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:40,741 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$qp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:41,762 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$rp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:15:42,445 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:15:42,781 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$sp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:15:43,274 | INFO | sshd-SshServer[55ec1453](port=8101)-timer-thread-1 | ServerSessionImpl | 125 - org.apache.sshd.osgi - 2.14.0 | Disconnecting(ServerSessionImpl[karaf@/10.30.170.65:45864]): SSH2_DISCONNECT_PROTOCOL_ERROR - Detected IdleTimeout after 1800643/1800000 ms.
2025-09-06T02:15:43,801 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$tp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:15:44,821 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$up], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:15:45,841 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$vp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:15:46,075 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:46,862 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$wp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:47,882 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$xp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:48,902 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$yp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:49,921 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$zp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:15:50,234 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:50,941 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ap], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:51,265 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:51,962 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Bp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:52,294 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:52,981 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Cp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:54,002 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Dp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:15:55,021 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ep], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:56,041 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Fp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:56,445 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:57,061 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Gp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:58,082 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Hp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:15:59,045 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:15:59,102 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ip], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:15:59,574 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:00,122 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Jp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:00,604 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:01,221 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Kp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:02,242 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Lp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:16:03,261 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Mp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:04,281 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Np], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:05,301 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Op], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:05,815 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:06,322 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Pp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:16:06,335 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:06,855 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:07,342 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Qp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:08,362 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Rp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:09,382 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Sp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:10,402 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Tp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:16:10,475 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:11,426 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Up], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:11,514 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:12,444 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Vp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:13,502 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Wp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:14,115 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:14,522 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Xp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:16:15,541 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Yp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:16,561 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Zp], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:17,582 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$0p], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:18,601 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$1p], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:18,785 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:19,621 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$2p], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:16:19,825 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:20,345 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:20,642 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$3p], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:20,855 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:21,662 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$4p], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:22,415 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:22,682 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$5p], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:16:23,454 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:23,701 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$6p], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:24,722 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$7p], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:24,996 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:25,742 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$8p], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:26,765 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$9p], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:16:27,565 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:27,781 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$+p], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:28,805 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$~p], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:29,824 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$aq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:31,235 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$bq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:32,252 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$cq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:16:32,794 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-3-shard-inventory-config#-401614540] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [1] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-09-06T02:16:32,795 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:33,272 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$dq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:34,292 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$eq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:34,854 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:35,312 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$fq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:16:36,332 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$gq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:36,415 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:37,352 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$hq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:37,975 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:38,372 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$iq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:39,392 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$jq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:16:40,055 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:40,412 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$kq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:41,431 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$lq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:42,135 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:42,452 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$mq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:43,473 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$nq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:16:43,695 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:44,492 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$oq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:45,512 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$pq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:46,532 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$qq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:47,326 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:47,552 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$rq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:16:48,571 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$sq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:48,885 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:49,592 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$tq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:50,612 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$uq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:51,631 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$vq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:16:52,005 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:52,652 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$wq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:53,672 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$xq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:54,595 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:54,691 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$yq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:55,711 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$zq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:16:56,732 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Aq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:57,752 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Bq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:58,235 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:58,755 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:16:58,772 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Cq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:16:59,792 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Dq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:16:59,795 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:17:00,812 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Eq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:01,355 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:17:01,833 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Fq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:01,875 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:17:02,856 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Gq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:17:02,915 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:17:03,435 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:17:03,872 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Hq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:04,476 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:17:04,892 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Iq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:04,994 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:17:05,912 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Jq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:17:06,932 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Kq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:07,594 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:17:07,952 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Lq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:08,972 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Mq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:09,154 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:17:09,991 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Nq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:17:10,705 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:17:11,013 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Oq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:12,032 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Pq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:13,052 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Qq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:14,072 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Rq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:15,092 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Sq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:17:16,112 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Tq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:17,131 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Uq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:18,152 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Vq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:19,172 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Wq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:20,192 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Xq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:17:21,212 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Yq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:22,232 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Zq], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:23,166 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:17:23,252 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$0q], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:23,685 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:17:24,272 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$1q], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:17:25,292 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$2q], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:25,766 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:17:26,312 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$3q], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:27,332 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$4q], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:27,845 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:17:28,352 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$5q], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:17:29,372 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$6q], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:30,392 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$7q], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:31,411 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$8q], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:32,432 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$9q], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:33,045 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:17:33,452 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$+q], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:17:34,472 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$~q], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:35,492 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ar], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:35,646 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:17:36,519 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$br], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:37,542 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$cr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:38,562 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$dr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:17:39,732 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$er], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:40,753 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$fr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:41,772 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$gr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:42,792 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$hr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:43,812 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ir], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:17:44,832 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$jr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:45,852 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$kr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:46,044 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:17:46,872 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$lr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:47,895 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$mr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:48,912 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$nr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:17:49,932 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$or], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:50,953 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$pr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:51,972 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$qr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:52,993 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$rr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:53,834 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:17:54,012 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$sr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:17:55,032 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$tr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:56,052 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ur], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:57,072 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$vr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:58,092 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$wr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:17:59,112 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$xr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:18:00,132 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$yr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:01,152 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$zr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:02,144 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:18:02,172 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ar], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:03,192 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Br], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:04,212 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Cr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:18:05,233 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Dr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:06,253 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Er], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:07,272 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Fr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:08,292 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Gr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:09,313 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Hr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:18:10,332 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ir], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:11,352 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Jr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:12,373 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Kr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:13,392 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Lr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:14,412 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Mr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:18:15,433 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Nr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:16,453 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Or], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:17,473 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Pr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:18,492 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Qr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:19,512 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Rr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:18:20,533 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Sr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:21,552 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Tr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:22,572 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ur], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:23,591 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Vr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:24,612 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Wr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:18:25,632 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Xr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:26,652 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Yr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:27,672 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Zr], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:28,692 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$0r], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:29,712 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$1r], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:18:30,732 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$2r], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:31,753 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$3r], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:32,774 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$4r], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:33,794 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$5r], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:34,345 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:18:34,812 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$6r], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:18:35,832 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$7r], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:36,853 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$8r], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:37,872 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$9r], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:38,892 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$+r], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:39,913 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$~r], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:18:40,932 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$as], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:41,953 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$bs], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:42,145 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-33 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:18:42,973 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$cs], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:43,705 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:18:43,992 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ds], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:18:44,734 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:18:45,012 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$es], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:46,032 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$fs], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:47,052 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$gs], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:47,855 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted 2025-09-06T02:18:48,072 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$hs], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:18:49,093 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$is], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:50,113 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$js], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:51,133 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ks], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:52,152 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ls], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-09-06T02:18:53,172 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ms], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-09-06T02:18:54,192 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ns], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:18:55,212 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$os], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:18:56,233 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ps], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:18:57,252 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$qs], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:18:58,273 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$rs], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:18:59,293 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ss], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:00,313 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ts], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:01,333 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$us], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:02,353 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$vs], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:03,373 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ws], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:04,393 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$xs], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:05,413 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$ys], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:05,515 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:19:06,433 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$zs], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:07,452 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$As], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:08,473 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Bs], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:09,493 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Cs], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:09,675 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-29 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:19:10,513 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ds], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:11,532 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Es], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:12,553 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Fs], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:13,573 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Gs], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:14,593 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Hs], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:15,612 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Is], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:16,632 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Js], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:16,955 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:19:17,652 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ks], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:18,673 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ls], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:19,238 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.PreLeader Verification" | core | 112 - org.apache.karaf.log.core - 4.4.7 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.PreLeader Verification
2025-09-06T02:19:19,692 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ms], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:20,712 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ns], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:21,733 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Os], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:22,754 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ps], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:23,772 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Qs], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:24,794 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Rs], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:25,813 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ss], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:26,833 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ts], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:27,853 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Us], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:28,873 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Vs], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:29,893 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ws], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:30,913 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Xs], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:31,933 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Ys], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:32,953 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$Zs], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:33,973 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$0s], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:34,993 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$1s], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:36,012 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$2s], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:37,032 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$3s], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:38,053 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$4s], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:39,076 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$5s], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:40,093 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$6s], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:41,113 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$7s], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:42,133 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$8s], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:42,394 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:19:43,153 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$9s], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:44,173 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$+s], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:44,985 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.0 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.248:2550], message stream] Upstream failed, cause: StreamTcpException: The connection has been aborted
2025-09-06T02:19:45,192 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$~s], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:46,212 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$at], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-09-06T02:19:47,232 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.0 | member-3-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.171.195:2550/temp/_user_shardmanager-config_member-3-shard-inventory-config$bt], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.